The Anatomy of a Swaps Trading Infrastructure

August 30, 2011

For the biggest swaps dealers, creation of their new OTC derivatives infrastructure will include rebuilding existing platforms, buying key elements from technology providers, leveraging technology already in place in other asset classes and, of course, building new platforms from scratch. This is not a buy-versus-build decision—it’s a careful balancing act of process and technology decisions to create a best-of-breed infrastructure.

A firm-wide data warehouse is the first step. Each business silo now has its own approach to collateral, valuations and even reference data. If new platforms are developed to work around existing data stores by merely normalizing the inputs, a huge opportunity will have been wasted. Just as the US government should have combined the CFTC and the SEC when it had the chance via the Dodd-Frank Act, financial firms should use the occasion of a huge and necessary overhaul to clean up legacy data inconsistencies that have festered for years.

On top of the data warehouse sits the data mart, which gives each individual business the appropriate view of the data. Once relegated to operations, the data mart approach must now be brought to the front office. The quantity of data needed to trade, and the speed at which that data must be consumed, are well beyond today's norms, so the trading desk must be able to quickly access and work with a huge store of data. The key elements needed to create this front-office data mart are data consumption, data storage, data retrieval and data distribution.
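A minimal sketch can make those four functions concrete. The class and field names below are illustrative assumptions, not part of any real platform; the point is only to show consumption, storage, retrieval and distribution as distinct responsibilities of one front-office component.

```python
from collections import defaultdict

class FrontOfficeDataMart:
    """Toy sketch of the four data-mart functions: consumption,
    storage, retrieval and distribution. All names are hypothetical."""

    def __init__(self):
        self._store = defaultdict(list)        # storage: tick history per instrument
        self._subscribers = defaultdict(list)  # distribution: callbacks per instrument

    def consume(self, tick):
        """Consumption: accept a normalized tick such as
        {'instrument': 'USD-5Y-IRS', 'rate': 2.15}."""
        self._store[tick["instrument"]].append(tick)  # storage
        self.distribute(tick)                         # push straight back out

    def retrieve(self, instrument):
        """Retrieval: hand the desk the full history for one instrument."""
        return list(self._store[instrument])

    def subscribe(self, instrument, callback):
        self._subscribers[instrument].append(callback)

    def distribute(self, tick):
        """Distribution: fan the tick out to every interested consumer."""
        for cb in self._subscribers[tick["instrument"]]:
            cb(tick)

mart = FrontOfficeDataMart()
received = []
mart.subscribe("USD-5Y-IRS", received.append)
mart.consume({"instrument": "USD-5Y-IRS", "rate": 2.15})
```

In a real deployment each of these responsibilities would be a separate scaled-out system; collapsing them into one class simply shows how the pieces interlock.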

Data consumption and distribution both rely on messaging middleware. High-speed listed markets have driven middleware providers to perfect the paradigm of ingesting and distributing millions of messages a second at ultralow latencies without missing a single message. In swaps, latency isn't a matter of shaving microseconds; even so, moving from minutes and hours to a few seconds or less is a huge paradigm shift. To that point, nearly every market participant TABB Group spoke to while researching Technology and Financial Reform: Data, Derivatives and Decision Making agreed that latency would become a much bigger issue in swaps trading as the market evolves.
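The "without missing a single message" guarantee is typically enforced with per-message sequence numbers that let every consumer detect a gap. The toy bus below is an assumed illustration of that idea, not a real middleware API:

```python
import itertools

class Bus:
    """Toy message bus: every subscriber sees every message in order,
    and sequence numbers make any drop detectable. Illustrative only."""

    def __init__(self):
        self._seq = itertools.count(1)  # monotonically increasing sequence
        self._subs = []

    def subscribe(self, handler):
        self._subs.append(handler)

    def publish(self, payload):
        msg = {"seq": next(self._seq), "payload": payload}
        for h in self._subs:
            h(msg)

class GapDetector:
    """Consumer that records any missed sequence number."""

    def __init__(self):
        self.last = 0
        self.gaps = []       # (expected, actual) pairs for each hole
        self.received = []

    def __call__(self, msg):
        if msg["seq"] != self.last + 1:
            self.gaps.append((self.last + 1, msg["seq"]))
        self.last = msg["seq"]
        self.received.append(msg["payload"])

bus = Bus()
det = GapDetector()
bus.subscribe(det)
for i in range(3):
    bus.publish(f"tick-{i}")
```

Production middleware adds retransmission and persistence on top of this; the sequence-number check is what lets a consumer know it needs them.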

Between the receipt and distribution of data sits the process of data storage and retrieval. Storing a string of market data ticks in a flat file might be suitable for high-frequency equity traders' back-testing strategies, but the swaps market demands more sophisticated methods. Traditional relational databases cannot cope with the volume and complexity either. Paradigms such as MapReduce and Hadoop, the same techniques that let search engines fulfill a query in milliseconds, running on massively parallel processing (MPP) servers, must now be used to find "the truth" in terabytes of swaps market data.
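The MapReduce pattern itself is simple to state: a map step emits key-value pairs, a shuffle groups them by key, and a reduce step collapses each group. The sketch below applies it to a hypothetical task, totaling swap notional by currency; the tick fields are assumptions for illustration.

```python
from collections import defaultdict

def map_phase(tick):
    """Map: emit (key, value) pairs, here (currency, notional)."""
    yield (tick["currency"], tick["notional"])

def shuffle(pairs):
    """Shuffle: group all values by key, as the framework would between phases."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(key, values):
    """Reduce: collapse each key's values, here by summation."""
    return key, sum(values)

ticks = [
    {"currency": "USD", "notional": 100},
    {"currency": "EUR", "notional": 50},
    {"currency": "USD", "notional": 25},
]
mapped = [pair for t in ticks for pair in map_phase(t)]
result = dict(reduce_phase(k, vs) for k, vs in shuffle(mapped).items())
# result == {"USD": 125, "EUR": 50}
```

What Hadoop adds is running the map and reduce functions in parallel across an MPP cluster, with the shuffle distributed over the network; the programming model is exactly this pair of functions.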

Of course, not all data sources will be needed in real time. In these cases, traditional databases may very well stand the test of time; however, they will act as inputs to the high-speed data marts sitting next to the trading desk. This takes us back to messaging: the OTC derivatives data problem requires tightly integrating the high-powered processing engine with the distribution mechanism so data can seamlessly flow in, be analyzed and flow out across the organization.
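A common shape for that arrangement is a batch load from the slow relational store into an in-memory cache that the desk queries directly. The sketch below assumes a hypothetical counterparty-rating table to show the pattern:

```python
import sqlite3

# Hypothetical: a relational store of counterparty reference data
# acts as the input to a fast in-memory cache next to the trading desk.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE counterparty (id TEXT PRIMARY KEY, rating TEXT)")
conn.executemany(
    "INSERT INTO counterparty VALUES (?, ?)",
    [("CP1", "AA"), ("CP2", "BBB")],
)
conn.commit()

# Batch load: one pass over the database populates the in-memory mart.
ratings_cache = dict(conn.execute("SELECT id, rating FROM counterparty"))

def rating(counterparty_id):
    """Trading-desk lookup served from memory, not from the database."""
    return ratings_cache.get(counterparty_id)
```

The database remains the system of record; the desk only ever touches the cache, which is refreshed on whatever schedule the data's staleness tolerance allows.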

Ultimately, the technological implications of financial reform are as much about each firm's ability to refocus and realign functionality as they are about volumes of data. Moving both data and process from a vertical to a horizontal alignment will allow these businesses to scale as regulations and the markets evolve; these investments can be made now, even in the absence of regulatory certainty.

This perspective was taken from the recent TABB Group study Technology and Financial Reform: Data, Derivatives and Decision Making.
