In the world of fintech, data capture has never exactly been the sexiest area of development.
That title tended to be reserved for areas such as low-latency solutions, analytics, or even risk management. But as market participants ready themselves for a new wave of trading rules, and as competition among cutting-edge trading firms shows no sign of abating, data capture has taken on far greater significance.
Whether a firm wants to optimise trading strategies or ensure that it complies with myriad trading regulations, having complete, accurate and easily retrievable trade-related data has become paramount. What is different is that the amount and granularity of the data required have grown exponentially.
For instance, MiFID II and MiFIR (the Markets in Financial Instruments Directive II and the accompanying Regulation) mean regulators will want more than just basic trade data. They may want all the related quote data, and they will want everything time-stamped. This applies to bilateral OTC trading, not just exchange-based trading, and it covers the full spectrum of asset classes. The good news is that modern packet capture technology is able to meet these needs.
Why did data capture technology get so much better? A good part of that has to do with the rise of algorithmic trading. As the race to zero latency intensified, firms were finding their capture mechanisms could not keep pace with their network speeds. But in recent years, particularly in Europe, the focus has been on meeting regulatory rules.
In both cases, zero packet loss is the first requirement. If you don't have all the data, how good will your back-testing be, and how will your analysis show where bottlenecks are occurring? And what would you say to a regulator asking for a complete report on all the activity surrounding a trade? For instance, a key focus of MiFID II is best execution. The only way to determine best execution is to consider all available quotes at exactly the times they were made.
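To make the best-execution point concrete, the check boils down to: for each fill, look up the quote prevailing at the trade's timestamp and compare the fill price against it. The sketch below is a toy illustration of that idea (the quote data, field layout, and function names are all hypothetical, not any vendor's or regulator's API):

```python
from bisect import bisect_right

# Hypothetical time-stamped quote stream: (timestamp_ns, bid, ask),
# sorted by timestamp. In practice this would come from captured
# market data, not a hard-coded list.
quotes = [
    (1_000, 99.95, 100.05),
    (2_000, 99.96, 100.04),
    (3_000, 99.90, 100.10),
]
quote_times = [q[0] for q in quotes]

def prevailing_quote(ts_ns):
    """Return the last quote at or before ts_ns, or None if none exists."""
    i = bisect_right(quote_times, ts_ns) - 1
    return quotes[i] if i >= 0 else None

def within_best_execution(ts_ns, side, price):
    """Check a fill against the quote prevailing at the fill's timestamp."""
    q = prevailing_quote(ts_ns)
    if q is None:
        return False  # missing quote data: best execution cannot be shown
    _, bid, ask = q
    # A buy should pay no more than the prevailing ask; a sell should
    # receive no less than the prevailing bid.
    return price <= ask if side == "buy" else price >= bid

print(within_best_execution(2_500, "buy", 100.04))   # buy at the ask
print(within_best_execution(2_500, "sell", 99.90))   # sell below the bid
```

Note that the check is only as good as the capture: a dropped quote packet shifts which quote appears "prevailing", which is exactly why complete, time-stamped capture matters.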
It’s possible for some firms to develop their own data capture mechanisms. Using FPGA technology, a firm can put a capture card on a server and grab all the data that flows across that part of the network. But the data needs to be captured in a meaningful way.
That’s where firms such as Napatech come in. They’ve made a business out of capturing the data throughout a network using Pandion network monitors and ensuring 100% capture. But it is not just a question of having complete data. It needs to be stored so that it can be retrieved – and quickly.
Napatech in this case has teamed up with firms such as Instrumentix, which specialises in translating network data so that it can be meaningfully interrogated, and Stream Financial, which provides analytics solutions based on federated queries across multiple data sources.
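The federated-query idea is that data stays in its separate sources and is joined at query time rather than copied into one central store. The toy sketch below illustrates the principle only; the record layouts, field names, and join key are hypothetical and do not reflect any vendor's actual product:

```python
# Toy federated query: two independent "sources" are queried separately
# and joined in the query layer, without centralising either dataset.
# All record layouts and names here are hypothetical.

# Source 1: decoded order events from the trading system.
order_events = [
    {"order_id": "A1", "symbol": "XYZ", "px": 100.04, "qty": 500},
    {"order_id": "A2", "symbol": "XYZ", "px": 100.01, "qty": 200},
]

# Source 2: wire timestamps recorded by a packet-capture appliance.
wire_times = [
    {"order_id": "A1", "wire_ts_ns": 1_700_000_000_123},
    {"order_id": "A2", "wire_ts_ns": 1_700_000_000_456},
]

def federated_join(events, times):
    """Join the two sources on order_id, as a query engine might."""
    ts_by_id = {t["order_id"]: t["wire_ts_ns"] for t in times}
    return [
        {**e, "wire_ts_ns": ts_by_id[e["order_id"]]}
        for e in events
        if e["order_id"] in ts_by_id
    ]

for row in federated_join(order_events, wire_times):
    print(row)
```

The design choice matters for this use case: trade records and captured network data live in different systems with different owners, so joining them on demand avoids duplicating terabytes of packet data into the analytics store.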
It all starts with capturing the data. But if firms don’t capture the data completely and in the right way, it could end there too – and not in the way they envisaged.