Managing the vast quantities of data brought into existence by the Dodd-Frank Act (DFA) and related regulation will present a challenge in the post-DFA environment, but collecting and producing the required data is just the tip of the iceberg. The ability to analyze and act on that data is what will separate the winners from the mere survivors. This is already true in many other parts of the global financial markets, but the complexities inherent in swaps trading, coupled with the speed at which these changes will take place, create unique challenges. Spread this across all five major asset classes and three major geographies, and the complexities become more pronounced.
Margin calculations are proving to be one of the biggest concerns for those revamping their OTC derivatives infrastructure. In a non-cleared world, dealers determine collateral requirements for each client and collect variation margin on a periodic schedule—in some cases once a month, and in other cases once a year. When those swaps are moved to a cleared environment, margin calculations will need to occur at least daily. The result is an upgrade from the current batch process with dozens of inputs to a near-real-time process with hundreds of inputs. Whereas before, major dealers could perform margin analysis, client reporting and risk management in a single system, those systems now need to operate independently within an infrastructure that provides the necessary capacity and speed.
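The shift from periodic to daily margining can be illustrated with a minimal sketch. The position identifiers and mark-to-market values below are illustrative assumptions, not a real clearinghouse methodology; the point is simply that each day's calls are derived from the change in marks since the prior cycle.

```python
# Minimal sketch of a daily variation-margin pass. Position ids and
# mark values are hypothetical; real CCP methodologies differ.

def daily_variation_margin(prev_marks, curr_marks):
    """Return the variation margin owed (+) or received (-) per position.

    prev_marks / curr_marks: dicts mapping position id -> mark-to-market value.
    A drop in a position's value means its holder must post the difference.
    """
    calls = {}
    for pos_id, curr in curr_marks.items():
        prev = prev_marks.get(pos_id, 0.0)
        calls[pos_id] = prev - curr  # positive -> margin call against the holder
    return calls

# Example: a swap marked at $1.2m yesterday and $0.9m today triggers a $300k call.
calls = daily_variation_margin({"IRS-001": 1_200_000.0}, {"IRS-001": 900_000.0})
print(calls["IRS-001"])  # 300000.0
```

Run once a month, a loop like this is a batch job; run at least daily across thousands of cleared positions and hundreds of pricing inputs, it becomes the near-real-time pipeline the article describes.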
The trading desk will require a similar seismic shift, as flow businesses will provide liquidity across multiple trading venues to an expanding client base. Most major dealers are at some stage of developing liquidity aggregation technology intended to provide a single view of liquidity across multiple swap execution venues. Creating this type of virtual order book requires receiving multiple real-time data feeds and aggregating the bids and offers in real time.
Furthermore, rather than comparing model-derived prices to the last trade price to produce quotes, inputs from SEFs, CCPs, SDRs, internal models, third-party models and market data providers will all feed real-time trading algorithms once reserved for exchange-traded derivatives.
Providing clients with execution services presents other challenges. Executing on multiple platforms means tracking and applying commission rates per client per venue in real time. Trade allocations further complicate the execution process. In the bilateral world, a big asset manager can do a $100 million interest rate swap and spread that exposure across multiple funds as it sees fit. Under the DFA, the executing broker must know which funds are getting how much exposure. Account allocation in and of itself is not new, but cost averaging multiple swap trades and allocating the right exposure at the right price to the proper account presents complex challenges, especially in a near-real-time environment.
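The cost-averaging step can be sketched as follows. The fund names, fill prices, and pro-rata allocation rule are illustrative assumptions; real allocation logic must also handle notional rounding conventions and each fund's eligibility constraints.

```python
# Sketch of average-price allocation: several partial fills of a block
# swap are blended into one average price, and the filled exposure is
# split across funds pro rata. Funds and prices are hypothetical.

def allocate_block(fills, targets):
    """fills: list of (notional, price) partial executions.
    targets: dict mapping fund -> requested notional.
    Returns fund -> (allocated notional, blended average price)."""
    total_filled = sum(n for n, _ in fills)
    total_requested = sum(targets.values())
    avg_price = sum(n * p for n, p in fills) / total_filled
    scale = total_filled / total_requested  # pro-rate if under/over-filled
    return {fund: (req * scale, avg_price) for fund, req in targets.items()}

fills = [(60e6, 2.50), (40e6, 2.54)]           # $100m filled at two rates
targets = {"Fund A": 70e6, "Fund B": 30e6}
allocs = allocate_block(fills, targets)
# Fund A receives $70m of exposure at the blended average rate of 2.516.
```

Each fund sees the same average price regardless of which fills it happened to be allocated, which is what makes the bookkeeping tractable when fills arrive from multiple venues in near-real time.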
Risk management, compliance and back-testing will also require huge increases in processing power, often at lower latencies. Risk models and stress tests, for example, are much more robust than they were before the financial crisis, requiring considerably more historical data.
Compliance departments now must store the requisite seven years of data so they can reconstruct any trade at any moment in the past. This is complicated enough in listed markets, where every market data tick must be stored, but for fixed-income securities and other swaps, storing the needed curves means that billions of records must not only be filed away but be retrievable on demand. Similar concerns exist for quants back-testing their latest trading strategies. It is not only the new data being generated that must be dealt with; existing data, too, is about to see a huge uptick in requirements.
In the end, these changes should achieve some of the goals set forth by Congress when it enacted Dodd-Frank: increased transparency and reduced systemic risk. The road there will be bumpy and expensive, but the opportunities created by both the journey and the destination will outweigh any short-term pain.
This perspective was taken from the recent TABB Group study Technology and Financial Reform: Data, Derivatives and Decision Making.
The DFA obfuscates the real issues that created the mortgage crisis and doesn't require the break-up of the bloated and unwieldy financial institutions that are still "too big to fail." It makes demands that are incredibly complex to execute and implement while doing very little to address the accountability of those leading the institutions. And it wondrously omits what should be its real goal: how to dissolve Fannie Mae and Freddie Mac, the institutions the namesakes of the law supported with glorious ignorance.
Think about it: Glass-Steagall was 34 pages long and worked for over 65 years. After it was done away with, our financial system was brought to its knees in less than nine. Dodd-Frank is over 1,500 pages long, will require hundreds of millions (if not billions) of dollars to fully implement, and may take years of delays to put into place. Why not simply reinstate Glass-Steagall and relegate this idiot law, Dodd-Frank, to the dustbin of history?