Everything You Always Wanted to Know About Algorithmic Trading Compliance (But Were Afraid to Ask)

Published on January 28, 2019

Algorithmic trading is white hot in financial markets, as the availability of more data, combined with better technology, allows firms to lean on machines both for auto-execution and for new investment decision-making capabilities.

Automation brings significant benefits in terms of cost reduction and faster execution, but, as is always the case, speed magnifies the damage when something starts to go wrong. That puts the onus on compliance and risk functions to keep pace with the new technology and approach, while regulators attempt some degree of ‘catch-up’ of their own.

Algo compliance has been on the UK Financial Conduct Authority (FCA) hit list for some time, following a paper published in February, and, as Charles Mo, managing consultant at GreySpark Partners, told Radar, there is currently no standard approach or method for determining how to manage algorithms to reduce the risk of disorder in the markets.

His advice to firms is to understand the regulatory obligations and set up scenarios which include best practice.

“Before starting to specify technical solutions, it is important to define the scenarios that are to be monitored and then look at the systems that can achieve this,” he said.

“There is a realization that lexicon-based systems are not scalable and are costly, hence causing much effort to be spent on managing and suppressing the false positives.”

He said stakeholders are now toying with the idea of using artificial intelligence (AI) and machine learning (ML) to replace rigid lexicon systems in an attempt to improve them, “since the regulations don’t specify the systems or approach required to do trade surveillance”.

In this method, scenarios are defined and then the system refines and improves itself through ML to ensure a maximum hit rate, he said. “AI allows a much larger diversity of data and information to be evaluated than the pure lexicon,” he said.
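The lexicon-versus-ML contrast Mo draws can be illustrated with a toy sketch. Everything below is invented for illustration (keywords, feature names, weights, and thresholds are assumptions, not any real surveillance system or vendor API): a lexicon flag fires only on banned phrases, while a score-based flag combines diverse behavioural signals, the way an ML model would after being refined on defined scenarios.

```python
# Toy comparison of lexicon-based vs score-based surveillance flagging.
# All keywords, features, weights, and thresholds are illustrative only.

LEXICON = {"guarantee", "front-run", "move the price"}

def lexicon_flag(message: str) -> bool:
    """Flag if any banned phrase appears -- rigid, and prone to false
    positives that then have to be managed and suppressed by hand."""
    text = message.lower()
    return any(term in text for term in LEXICON)

def scored_flag(features: dict, weights: dict, threshold: float = 1.0) -> bool:
    """Combine several behavioural signals into one score, the way an
    ML-style system evaluates a larger diversity of data than keywords."""
    score = sum(weights.get(name, 0.0) * value for name, value in features.items())
    return score >= threshold

# Hypothetical weights an ML system might converge on during refinement.
weights = {"off_market_price": 0.6, "unusual_counterparty": 0.5, "chat_hit": 0.3}

# A trade with no banned keywords can still be flagged on behaviour alone:
features = {"off_market_price": 1.0, "unusual_counterparty": 1.0, "chat_hit": 0.0}
print(lexicon_flag("please work this order quietly"))  # False
print(scored_flag(features, weights))                  # True (1.1 >= 1.0)
```

The point of the sketch is the scalability argument from the quote: adding a new behavioural signal means adding one feature, not curating and suppressing an ever-growing keyword list.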

By nature, algorithms are deterministic while markets are volatile, he said, so there is a degree of randomness involved. “Nevertheless, the output is much more scientific,” he said.

The FCA and UK Prudential Regulation Authority are both demanding accountability in developing and changing algorithms, and both institutions have paid visits to quant teams of Tier 1 banks, starting from a low bar in terms of best practice, said Mo.

The volume and business flow in algorithmic trading are currently huge, creating significant market risk, said Mo. As a result, the regulators want to audit algorithm stress testing and its management, with evidence of stringent processes for managing risk to both the market and the business.

“This whole process is not rocket science; it is just about having an effective process for the algorithm development lifecycle which is documented and evidenced,” Mo told Radar. “The banks are expected to take the initiative and make this a priority.”

He said the stress testing required by the regulators is not currently especially demanding, which should provide some comfort given that many firms already run performance benchmarking tests.

“It is important to understand how the algorithm is architected and integrated,” he said. “So in essence, it is a systems stress test where you are looking for the weakest link, and you benchmark the systems and limits accordingly. The kill-switch and procedure must be able to accommodate twice the usual market rate – that is the testing standard – then you can start to think about monitoring.”
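The twice-the-usual-market-rate kill-switch standard Mo describes can be sketched as a simple sliding-window throttle. This is a minimal illustration under stated assumptions: the class name, baseline rate, and one-second window are invented here, not taken from any regulator's specification or any firm's implementation.

```python
from collections import deque

class KillSwitch:
    """Trips when the observed message rate exceeds a multiple of the
    benchmarked baseline (2x here, per the testing standard described).
    Baseline, multiple, and window values are illustrative."""

    def __init__(self, baseline_rate: float, multiple: float = 2.0,
                 window: float = 1.0):
        self.limit = baseline_rate * multiple  # max messages per window
        self.window = window                   # window length in seconds
        self.events = deque()
        self.tripped = False

    def record(self, now: float) -> bool:
        """Record one outbound order/message; return True once tripped."""
        self.events.append(now)
        # Drop events that have aged out of the sliding window.
        while self.events and self.events[0] < now - self.window:
            self.events.popleft()
        if len(self.events) > self.limit:
            # In production this is where orders are pulled and the algo halted.
            self.tripped = True
        return self.tripped

ks = KillSwitch(baseline_rate=100)     # 100 msgs/sec usual -> limit of 200
for i in range(250):
    ks.record(now=i * 0.001)           # 250 messages inside one second
print(ks.tripped)                      # True: 250 > 200
```

This also reflects the “weakest link” framing: the benchmark is set per system, and the switch is tested end-to-end against double the benchmarked rate before monitoring is layered on top.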

It is key, he said, that the algos can operate in a similarly stressed environment tested end-to-end, not just at the unit-test level.

In his view, the regulators are not looking for complex technological solutions, just common-sense best practices, and he believes the industry is for the most part getting it right, though this may change rapidly. “In future, the more advanced monitoring systems will be useful if you want to develop new strategies, and this can help to differentiate the banks,” he said.

Next year, in order to organize the governance around risk management in algorithmic trading, there will be a separation of duties between the operational first and second lines, said Mo.

“This will result in a governance framework around those working in the first line, who are running the day-to-day operations of the banks, and implementing the controls and limits and ensuring systems compliance, in other words, the front office quant IT teams.”

The organization should be set up in a way that allows different people to be accountable, he said, so the second line sets processes and policies and the first line derives procedures for putting in controls and ensuring that they meet the regulations.

“The outcome is different types of ownership,” he said. “MiFID II discusses how staffing should ensure that recruits have the right skills and training. Transaction reporting is the responsibility of the operational team that works for the product manager, and the technical staff behind it are accountable to the IT managers for the running of those systems.”

“This governance framework will not only include algorithms but trade transparency, best practices, and record-keeping, ensuring the algorithm team or head of product can own a library of algorithms within their domain.”

The introduction of the accountability guidance in the UK Senior Managers and Certification Regime (SMCR) will put more detail and emphasis on establishing clear accountability and delegated authority around algorithmic compliance, as currently only the chief risk officer or head of the trading desk has to be registered. “The SMCR is more draconian in that it mandates delegated authority down the chain,” said Mo.

When looking at the delivery lifecycle of algorithms, the regulators are demanding that each stage, from development to testing to production, be segregated and independent, Mo said. There must be responsible individuals recognized for that quality: when a concept or change is developed, it needs to be documented and signed off by a senior manager before a single line of code is written and development begins.

“Currently, this is dealt with inconsistently; some banks do and some don’t, so there is a need to find a head of product to sign off on all changes before they commit resources to build.”
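The sign-off-before-build gate described above can be modelled in a few lines. This is a hedged sketch only: the class, field names, and roles (`head_of_product`) are hypothetical stand-ins for whatever change-management tooling a bank actually uses.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class AlgoChange:
    """A documented change request that must be signed off by a senior
    manager before development starts. Fields and roles are illustrative."""
    description: str
    signed_off_by: Optional[str] = None
    signed_at: Optional[datetime] = None
    status: str = "documented"

    def sign_off(self, senior_manager: str) -> None:
        """Record who approved the change and when, for the audit trail."""
        self.signed_off_by = senior_manager
        self.signed_at = datetime.now(timezone.utc)

    def start_development(self) -> None:
        """Refuse to move the change forward without a recorded sign-off."""
        if self.signed_off_by is None:
            raise PermissionError("change lacks senior-manager sign-off")
        self.status = "in_development"

change = AlgoChange("Tighten participation cap on a VWAP strategy")
try:
    change.start_development()          # blocked: no sign-off yet
except PermissionError as exc:
    print(exc)
change.sign_off("head_of_product")
change.start_development()
print(change.status)                    # in_development
```

The design choice mirrors the quote: resources cannot be committed to build until a named individual has taken ownership, and that ownership is recorded rather than implied.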

A very clear definition of roles therefore becomes critical, setting out responsibilities and the capacity to delegate to individuals who are then registered with the FCA. Not many, such as IT staff, are comfortable with this new responsibility, Mo said.

“Much comes down to common sense and due diligence; what regulators really want is clear documentation and evidence of issues that have happened, and also how you tackle them, with an audit trail, good records around algorithms, and change management that is consistent,” said Mo. “They want continuous compliance, and the best way to evidence this is to constantly log changes, challenging issues that might be violations, testing procedures, and action plans put in place, and complementing all this must be a nice paper trail for the FCA to come and inspect.”