The Consolidated Audit Trail – process, framework, risk – it’s time to get busy!

Industry compliance veteran James Lamke opines on the Consolidated Audit Trail (CAT). He offers some practical advice on the framework behind this mammoth challenge that everyone should be contemplating.

Jim Lamke has seen it all in his time – he has more than 30 years of Wall Street experience, having held some of the most senior operational, but predominantly compliance, roles at iconic firms like Goldman, Bear Stearns, Spear Leeds & Kellogg, and Sanford C. Bernstein. He claims he is not an expert on everything, but when he gets going, it's clear he knows a lot about a lot, and is pretty much unfazed by any new regulatory challenges.

But when you talk to him about CAT, even he seems somewhat concerned by what lies ahead; a number of issues are still not fully defined, nor is the complete depth, breadth and complexity of what needs to be done. His immediate reaction is, “understand your challenges, as everyone’s business model, data and technology structures are different.”

“Most firms I have spoken with, and consulted at, are behind the curve right now to various degrees. Vendors are scrambling to build CAT reporting modules, rebuilding their databases and attempting to revamp their systems to address the various requirements of CAT. It is a massive ask of both the member firms and the vendors, and most of them are working feverishly to add staff and resources to comply.”

He wonders how everything will get done on time, given the complexity of what is being asked, with the US Securities and Exchange Commission imposing strict deadlines and multiple issues that are still unresolved, resulting in some passionate industry concerns. For example, Jim notes, “a number of market centers have generated their own industry market IDs. Based on that, separate Self-Regulatory Organizations (“SROs”) have generated identical MPIDs for different firms. You could have the same MPID for three different firms from multiple SROs, so when you transmit that data to CAT, it is unclear which firm is receiving or placing that order.”

When this issue was identified late last year, firms were informed of it and directed to transmit the CAT Industry Member Identifier (IMID) along with various other data points to reflect the actual identity of the firm with which the transaction was conducted.

FINRA CAT is now publishing a master list daily, for firms to access and download into their technology infrastructure to accurately identify the IMID. No one had anticipated such an issue until they looked at the data and saw this. So FINRA CAT informed firms that they can retain their MPID but will also need to transmit additional data points to differentiate themselves from others. This is a classic example of the “devil in the detail” within the CAT reporting requirements and there are possibly many more details such as this that we will see arise. Jim summarizes, “FINRA CAT is saying they are on track, but they are the receiver of the data, not the transmitter of the data. The firms are required to adjust their internal data to meet the CAT data reporting format(s) defined by FINRA CAT, placing a lot of responsibility, and arguably cost, on the industry members to adjust their systems and processes to meet the CAT requirements”.
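As a rough illustration of the disambiguation problem described above, the sketch below (with invented field names and identifiers, not the actual FINRA CAT schema) shows why an MPID alone is ambiguous when multiple SROs have issued the same one, and how a composite key against a downloaded master list resolves a unique IMID:

```python
# Hypothetical sketch: disambiguating duplicate MPIDs using a daily
# master list. Field names and values are illustrative only.

# Simulated rows from the daily master list: the same MPID can be
# issued by different SROs to different firms, so an (MPID, SRO)
# pair is needed to resolve a unique IMID.
master_list = [
    {"mpid": "ABCD", "sro": "NYSE",   "imid": "NYSE:ABCD"},
    {"mpid": "ABCD", "sro": "NASDAQ", "imid": "NASDAQ:ABCD"},
    {"mpid": "WXYZ", "sro": "NYSE",   "imid": "NYSE:WXYZ"},
]

# Index the list by the composite key so lookups are unambiguous.
imid_index = {(row["mpid"], row["sro"]): row["imid"] for row in master_list}

def resolve_imid(mpid: str, sro: str) -> str:
    """Return the unique IMID for an (MPID, SRO) pair, or raise."""
    try:
        return imid_index[(mpid, sro)]
    except KeyError:
        raise ValueError(f"No IMID on the master list for {mpid!r}/{sro!r}")

print(resolve_imid("ABCD", "NASDAQ"))  # NASDAQ:ABCD
```

The point of the sketch is simply that the MPID `"ABCD"` by itself cannot identify a firm; only the additional data points make the lookup deterministic.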

It is hard not to touch upon the potential liability, which has been a point of contention between CAT's administrators and the firms. The CAT reporter agreement's liability section limits liability to $500; unsurprisingly, firms were not willing to sign this, as it left them hugely exposed for any data transfer that goes astray or is hacked. Jim muses, “how do you control access to the data with 3,000 regulatory submitters? Issues such as privacy, liability and cost are three big concerns with the CAT process. Is there too much personal or firm transactional data in CAT? Who will be accepting liability for any losses if this data is improperly accessed by unauthorized parties and utilized to potentially harm the market and market participants?”

While the liability question has been debated, the resulting delays have pushed back the first rounds of testing. Firms can now perform testing, but most are not comfortable with the liability exposure once CAT goes live. This still needs to be defined and addressed.

Firms already have numerous regulatory reporting obligations such as OATS and Blue Sheets. These processes in place are unlikely to change in the near term.

“The biggest risk firms have is when CAT goes live into production, firms should make sure their CAT-reported data is similar to their Blue Sheets and OATS-reported data. Otherwise the discrepancies in the various regulatory reporting processes could result in additional regulatory inquiries as to the nature and scope of the discrepancies.”

He warns that moving from one system or process to another, i.e. from development to production, can generate unintended or unexpected consequences. Many firms do not have perfectly mirrored development and production systems, and as such it is extremely difficult to reconcile the outputs of the development system with those of the production system. Firms should reconcile OATS, Blue Sheets and other regulatory reporting data already in production against their CAT data. This will assist them in identifying whether they are missing anything or inadvertently modifying data points that should not be altered.
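A minimal sketch of such a cross-report reconciliation, using invented record layouts rather than the real OATS or CAT formats, might compare records keyed by order ID and surface both missing orders and field-level differences:

```python
# Illustrative reconciliation sketch: compare what a firm reported
# under an existing regime (e.g. OATS) with what it would report to
# CAT. Record layouts and order IDs are invented for illustration.

oats_records = {
    "ORD-1": {"symbol": "XYZ", "qty": 100, "side": "B"},
    "ORD-2": {"symbol": "ABC", "qty": 250, "side": "S"},
}
cat_records = {
    "ORD-1": {"symbol": "XYZ", "qty": 100, "side": "B"},
    "ORD-2": {"symbol": "ABC", "qty": 200, "side": "S"},  # qty differs
    "ORD-3": {"symbol": "DEF", "qty": 50,  "side": "B"},  # not in OATS
}

def reconcile(a: dict, b: dict) -> dict:
    """Report orders missing from either side and field-level mismatches."""
    mismatches = {
        oid: {f: (a[oid][f], b[oid][f])
              for f in a[oid] if a[oid][f] != b[oid][f]}
        for oid in set(a) & set(b) if a[oid] != b[oid]
    }
    return {
        "missing_in_b": sorted(set(a) - set(b)),
        "missing_in_a": sorted(set(b) - set(a)),
        "mismatches": mismatches,
    }

report = reconcile(oats_records, cat_records)
print(report["mismatches"])    # {'ORD-2': {'qty': (250, 200)}}
print(report["missing_in_a"])  # ['ORD-3']
```

Even a toy diff like this makes the point: discrepancies surface only when the two report streams are compared side by side, which is exactly what a regulator can do once both exist.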

Another issue worth mentioning is that many firms use third-party vendor platforms, and their respective Order Management Systems (“OMS”) and Execution Management Systems (“EMS”). When those firms trade and transmit OATS, Blue Sheets, and other regulatory reports, they must be certain their vendor is transmitting all the requisite data on the firm’s behalf. The industry member bears the regulatory responsibility for what is transmitted and for maintaining the books and records concerning those transmissions. They must also monitor and supervise the accuracy and completeness of those transmissions. It would be much simpler if firms just had one OMS and EMS, but that is often not the case, since many have such diverse business models and may require numerous systems and technologies to conduct their business.

If there are multiple levels of systems and technologies within the firm, there might be different hierarchies, structures of data or transmissions across various technologies. This makes bringing this process in-house, and looking at it holistically, much harder. Whether it is FINRA, the SEC, NYSE or any other SRO, any regulator that needs access to the data is going to expect the same information. The challenge is making it one coherent process when multiple ones exist.

In time, OATS and Blue Sheet reporting, and hopefully other regulatory reporting requirements will be “sunset”, leaving CAT as the primary source for this information. This will occur once the SEC and various other SROs are confident in the CAT process and data. CAT reporting will initially have orders, routes and executions. Quotation activities and allocation information will come in the near future. Blue Sheets has allocation data which might not be submitted as retail or institutional allocations until 2022. It may take another few years for the regulators to get comfortable that the data is accurate before OATS, Blue Sheets and hopefully other regulatory reports get sunset.

Many firms are rightly concerned that if they do not get something right, the regulators will have the ability to reconcile their OATS and Blue Sheets submissions against CAT, and vice versa. This reveals how accurately firms are transmitting their data, and many fear regulators may use this against them. Jim qualifies this, “while this has not been specified, and the SEC, FINRA and SROs are full of smart people with common sense, I believe that there will be a honeymoon period for firms to work out the kinks and the nuances of this whole process, given its complexity.” Under the current testing process, firms must achieve 90 to 95 percent accuracy on submitted data points (legitimate values in a legitimate schema) in order to gain access to the next phase of CAT testing. The step-by-step process set up for this by FINRA makes sense.
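To make the gating idea concrete, here is a toy pre-submission check. The validity rules and record layout are stand-ins (the real requirements are defined by the CAT schema), but the shape of the gate, i.e. a validity rate measured against a threshold, follows the 90 to 95 percent figures described above:

```python
# Toy sketch of an accuracy gate: count records whose data points
# carry legitimate values in a legitimate (stand-in) schema, and
# check the rate against a threshold. Not the real CAT validation.

ALLOWED_SIDES = {"B", "S"}

def is_valid(record: dict) -> bool:
    """Stand-in validity check: required fields with legitimate values."""
    return (isinstance(record.get("qty"), int) and record["qty"] > 0
            and record.get("side") in ALLOWED_SIDES)

def passes_gate(records: list, threshold: float = 0.95) -> bool:
    """True if the share of valid records meets the threshold."""
    valid = sum(1 for r in records if is_valid(r))
    return valid / len(records) >= threshold

sample = [{"qty": 100, "side": "B"}] * 19 + [{"qty": -5, "side": "X"}]
print(passes_gate(sample))  # True: 19 of 20 valid = 0.95
```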

“While there may not be immediate regulatory findings under CAT, it is likely there will be some ‘taps on the shoulder’ to inform firms that their data points are acceptable, but the future reconciliation process of the data may not be. If the firms can show they are working on this diligently and honestly, they will probably receive some latitude and leniency.”

Do firms need to reinvent the wheel to complete the governance framework? Can they leverage some OATS/Blue Sheets/LOPR process for the new process? It is possible but a number of third-party vendors are rebuilding their CAT database from scratch (no pun intended) as their current database structure won’t work, given the depth and breadth of data required under the CAT process. A new database requires a new governance model and new control frameworks.

Many firms do not have a process in their systems to reconcile data across their entire technology platform, and many utilize third-party vendors who themselves may need to build these capabilities for their clients. This is not a switch you can just “flip” one day to work the next. Firms have a variety of different technologies, data structures, processes and connectivity that all need to be reviewed and possibly re-thought.

Transmitting the CAT data itself is straightforward, since it is similar to the current OATS process. Data must be submitted early the next day, but CAT files will be larger and may require additional bandwidth to transmit. Firms should evaluate how long it will take to compile this file overnight and make sure their internal processes can compile and transmit the added data in the requisite timeframe. What if a portion of your data comes from a third party, such as an OMS vendor, and you also have internal data that must be combined to produce a complete CAT file? Will you receive the third-party data in time to compile and report your CAT transmission, and if so, how are you ingesting that data back into your internal systems or processes for a timely submission?

There are many nuances to be worked out, and every firm will be a little different given its systems and technologies. Which data points in your internal or external systems represent the required data structures within the CAT Data Dictionary? All of these issues need to be mapped and reconciled.
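The field-mapping question raised above can be sketched as follows. All source and target field names here are invented for illustration; the real target layout is defined by the CAT Data Dictionary, and the real sources are each firm's own internal and vendor systems:

```python
# Sketch of the field-mapping problem: internal and vendor systems
# each name fields differently, and both must be translated into one
# common target layout before transmission. All names are invented.

INTERNAL_MAP = {"ticker": "symbol", "shares": "quantity", "bs": "side"}
VENDOR_MAP   = {"sym": "symbol", "qty": "quantity", "buy_sell": "side"}

def translate(record: dict, mapping: dict) -> dict:
    """Rename source fields to target fields; fail loudly on gaps."""
    missing = [src for src in mapping if src not in record]
    if missing:
        raise ValueError(f"Unmapped source fields: {missing}")
    return {target: record[src] for src, target in mapping.items()}

internal_row = {"ticker": "XYZ", "shares": 100, "bs": "B"}
vendor_row   = {"sym": "ABC", "qty": 250, "buy_sell": "S"}

# Two differently-shaped sources end up in one uniform layout.
combined = [translate(internal_row, INTERNAL_MAP),
            translate(vendor_row, VENDOR_MAP)]
print(combined[0])  # {'symbol': 'XYZ', 'quantity': 100, 'side': 'B'}
```

The hard part in practice is not the translation itself but building and maintaining one such mapping per system, and catching the gaps before transmission rather than after.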

The discussion moved on to the impact of CAT on smaller broker-dealers and any potential longer-term benefits. Whenever a new regulatory process involving monitoring and books and records is added, extra costs are generally incurred. OATS and Blue Sheets are basic transmissions, given their data structures and the length of time they have been in place; CAT is more complex with respect to the depth and breadth of data to be transmitted. Do firms have the financial wherewithal to build or buy that capability? Larger firms are building it, as they have the capital, technology, staff and a better understanding of their internal trading systems, technologies, data, etc. Smaller firms are generally utilizing third-party vendors to perform the requisite CAT reporting for them, which will likely become an additional cost. Smaller firms may be charged additionally for OATS, Blue Sheets and CAT, and there may be further costs for creating and managing the technology and connectivity to transmit the data. They may need additional servers, memory and processes to maintain the data, and there may even be additional work with respect to the new regulatory requirements to monitor and surveil the reporting. Jim says,

“Some of the larger firms have hired dozens of people to focus on CAT while others are doing it with as few bodies as possible and are struggling. There will be more costs, but how much is firm-dependent.”

CAT’s evolution, and the breadth and depth of the data, promises to make some in the industry uncomfortable. George Orwell’s 1984 comes to mind – it will not just be order routes, execution and settlement data; there will also be quotation data. Regulators will be able to understand much more about what a firm is doing and how it conducts business. They will be able to monitor both the firm’s and clients’ activities across the market. The way trading takes place now will be visualized much more effectively by the regulators, especially the activity of the bigger institutions with multiple counterparties across the whole market. Eventually, there may be no need to send out individual Blue Sheet requests to every firm that has conducted activity. The data will be there in front of the regulator.

Jim is sanguine about how this new direct transparency will change the role of internal compliance at firms. Many firms will still have manual process requirements from manual trading, and will still need compliance advisory to assist those doing that business, around orders, external routing and proprietary book trading. The nature of the questions may change but there will still be a number of different market structures. For the electronic trading houses, the question will be more whether they do proprietary or agency trading manually, or electronic/algorithmic trading. The nature of the firm’s business activities and its trading activities will determine what types of compliance advisory services are required.

As the shift to electronic/algorithmic trading grows, it will drive a shift in the technical needs of the compliance role. As all this is codified into rules, compliance is still required to ask the right questions on trading methods, intent, structure and the desired outcome, in order to determine whether this is being done appropriately and in accordance with the various regulatory requirements. Programmers are not generally responsible for this. They are great at routing different pieces of a transaction to different exchanges and devising a method for trading, but a compliance person should determine whether the firm is buying and selling against itself on a different exchange, which constitutes a wash transaction, or routing to the appropriate market centers for execution.
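As a purely hypothetical illustration of the kind of check compliance would ask for here, the sketch below flags firms that appear on both sides of the same symbol, possibly across different venues. Real wash-trade surveillance is far more nuanced (time windows, price bands, beneficial ownership), and all names and records here are invented:

```python
# Illustrative surveillance sketch for the wash-trade pattern
# described above: the same firm with both buy and sell executions
# in the same symbol, possibly on different venues. A real system
# would apply time, price and ownership tests; this only shows the
# shape of the check.

from collections import defaultdict

trades = [
    {"firm": "F1", "symbol": "XYZ", "side": "B", "qty": 100, "venue": "NYSE"},
    {"firm": "F1", "symbol": "XYZ", "side": "S", "qty": 100, "venue": "ARCA"},
    {"firm": "F2", "symbol": "ABC", "side": "B", "qty": 50,  "venue": "NYSE"},
]

def flag_potential_wash_trades(trades):
    """Flag (firm, symbol) pairs where one firm both bought and sold."""
    sides = defaultdict(set)
    for t in trades:
        sides[(t["firm"], t["symbol"])].add(t["side"])
    return sorted(key for key, s in sides.items() if {"B", "S"} <= s)

print(flag_potential_wash_trades(trades))  # [('F1', 'XYZ')]
```

The programmer's routing logic above would see two unremarkable orders; only a check that looks across both sides of the firm's activity surfaces the pattern.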

Compliance does need to be more technologically advanced to understand these issues and scenarios, and to be at the table when these structures are discussed and developed. The skill set is changing. It now demands an understanding of market structure, regulatory structure, data and how it flows, how a firm conducts its trading activities and how its systems operate. There is a lot to be conversant with. So much of the role is about data management: being able to understand the data flows through the technologies utilized by the firm and the metadata surrounding them. Compliance must be able to explain what transpired to the business, an auditor or a regulator, and firms must understand and be able to factually convey how their processes work. When questioned by regulatory staff, the firm can escalate the issue to the regulator for additional guidance and opinions.

“The US may be ahead of the curve right now with this, but the European Securities and Markets Authority, UK FCA, SEBI in India, HK SFC etc. are all augmenting their regulations with respect to electronic and algorithmic trading. This thinking will migrate its way around the world given the potential market impact that electronic and algorithmic trading activities can have on a market”, said Jim. “It will be a challenge to cope with the various regional and international regulatory structures for transmitting this data. The bulk of reporting formats will be similar. But what will be different will be the regulatory nuances. That will depend on the regulatory environment, country and structure therein. Firms need to identify and acknowledge these nuances in their data structure and transmissions. Different data points will be needed under the transaction reporting requirements, e.g. different values for customer IDs in the UK versus the US. Firms need to understand the differences and similarities in the regulatory schemas around the world”.

James Lamke is a Senior Advisor – Institutional Sales & Trading – Compliance & Regulatory at Capital Forensics, Inc. He has served as a Co-Chief Compliance Officer and Director of Global Trading Compliance at several global institutional services firms, focusing on regulatory and compliance programs. He has designed, developed and implemented surveillance technologies for many of the firms’ activities, including Sales and Trading, Research, Capital Markets, Operations, regulatory reporting for OATS, Blue Sheets, LOPR, CAT, and short-interest reporting, enabling him to provide consultative expertise to firms. Earlier in his career, Jim was with the NASD (predecessor to FINRA) as an Analyst, and then an Investigator in Market Regulation.