Trends in Alternative Market Data
Half of investment firms using alternative market data for investment strategies
Market data is now the lifeblood of trading activities. Historically, this corner of the digital infrastructure landscape was dominated by large incumbents, but alternative market data vendors are gaining market share by focusing on niches such as direct exchange, low-latency and ultra-low latency feeds. According to Builtin, roughly half of investment firms use alternative data to forge investment strategies.
Alternative data typically refers to data used within the financial services industry that is gathered from outside traditional or official sources, such as company filings and broker forecasts.
In 2024, GreySpark believes that the core elements of an alternative market data service offering are:
● Real-time Normalised Market Data – Aggregated from co-located and non-co-located exchanges, normalised and redistributed for use by e-trading systems, as illustrated in the sketch after this list; and
● Historical Market Data – Captured from a ticker plant to provide a historical view on prices for purposes related to benchmarking, reporting, and the back-testing of algos and models.
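As an illustration of what normalisation involves, the sketch below (in Python) folds ticks from two hypothetical venue feeds, each with its own field names, units and timestamp conventions, into a single schema suitable for redistribution to e-trading systems or capture into a historical store. The venue formats, field names and prices are assumptions made purely for illustration, not the conventions of any particular exchange or vendor.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Common schema that downstream e-trading systems consume.
@dataclass
class NormalisedTick:
    venue: str
    symbol: str
    bid: float
    ask: float
    ts_utc: datetime

def normalise_venue_a(raw: dict) -> NormalisedTick:
    # Hypothetical venue A publishes prices in pence with epoch-millisecond timestamps.
    return NormalisedTick(
        venue="VENUE_A",
        symbol=raw["sym"],
        bid=raw["bid_px"] / 100.0,
        ask=raw["ask_px"] / 100.0,
        ts_utc=datetime.fromtimestamp(raw["ts_ms"] / 1000.0, tz=timezone.utc),
    )

def normalise_venue_b(raw: dict) -> NormalisedTick:
    # Hypothetical venue B quotes in major currency units with ISO-8601 timestamps.
    return NormalisedTick(
        venue="VENUE_B",
        symbol=raw["instrument"],
        bid=raw["best_bid"],
        ask=raw["best_ask"],
        ts_utc=datetime.fromisoformat(raw["timestamp"]),
    )

# Aggregation: ticks from both feeds land in one stream with a single schema,
# ready for redistribution or for capture into a historical store.
ticks = [
    normalise_venue_a({"sym": "VOD.L", "bid_px": 7250, "ask_px": 7252, "ts_ms": 1717400000123}),
    normalise_venue_b({"instrument": "VOD.L", "best_bid": 72.49, "best_ask": 72.53,
                       "timestamp": "2024-06-03T08:13:20.456+00:00"}),
]
for tick in ticks:
    print(tick)
```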
Across the capital markets, data-as-a-service (DaaS), data marketplaces and alternative trading venues are three of the most impactful data trends to have emerged in the last decade.
DaaS models have been growing in importance for financial firms, due in part to their ability to resolve historical bandwidth issues. Roughly half of financial services firms’ data is now stored in the cloud. Market participants can access market data directly from data vendors via DaaS plugins, reducing server-related limitations. Financial services firms are increasingly embracing DaaS models, while key market data providers and exchanges are choosing to offer DaaS plug-in solutions and centralised data distribution models.
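A minimal sketch of what consuming market data through a DaaS plug-in can look like from the client side, assuming a cloud-hosted REST endpoint; the URL, path, parameters and response fields below are hypothetical and do not correspond to any specific vendor's API.

```python
import requests  # third-party library: pip install requests

# Hypothetical DaaS endpoint and credentials, for illustration only.
BASE_URL = "https://daas.example-vendor.com/v1"
API_KEY = "YOUR_API_KEY"

def fetch_quotes(symbols: list[str]) -> list[dict]:
    """Pull the latest quotes for a list of symbols from a cloud-hosted DaaS service."""
    response = requests.get(
        f"{BASE_URL}/quotes",
        params={"symbols": ",".join(symbols)},
        headers={"Authorization": f"Bearer {API_KEY}"},
        timeout=5,
    )
    response.raise_for_status()
    # Assumed response shape: {"quotes": [{"symbol": ..., "bid": ..., "ask": ...}, ...]}
    return response.json()["quotes"]

if __name__ == "__main__":
    for quote in fetch_quotes(["VOD.L", "BARC.L"]):
        print(quote)
```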
In addition, there has been a notable rise in the use of data marketplaces over the last 10 years. Data marketplaces bring together data providers and data consumers, facilitating the buying and selling of data. Providers generate revenues from selling their data, while consumers can get access to specific data that suits their needs, creating a win-win for both parties. In the early days, data marketplaces focused on intraday and historical data. Today, more data marketplaces offer real-time – or close to real-time – market data too. A significant advantage of using a data marketplace is that it will often provide standardised data to its clients.
While larger financial firms can trade on exchanges directly, smaller companies find it challenging to become exchange members and thereby gain the option to trade on the venue itself without an intermediary making their trades. Consequently, there has been a rise in alternative trading venues that allow direct market access for these smaller players. Entities such as The Better Alternative Trading System (BATS) have purchased seats across various markets, enabling smaller traders to access those markets directly through BATS. Another advantage of this arrangement is that the smaller firms can purchase market data from the venues to which BATS, or the like, facilitate their direct access.

Despite having the option to purchase market data across a multitude of channels, financial firms still control the data that enters their systems. The large amount of data they can purchase, in various formats, however, makes ensuring data quality challenging. In many cases, a lack of market data monitoring causes duplications, extended latencies and, ultimately, higher costs for the firm as it troubleshoots the consequential issues.
With most financial products trading electronically, latency now tops the agenda for proprietary traders and market-makers – regardless of asset class – when they first engage with digitalised infrastructure. Latency is the total elapsed time between the event, or signal, that triggers a trading decision and the actual completion of that trading action. The origins of low-latency trading are traceable to the arbitrage desks of the 1990s, where speed of market connectivity was used directly to ‘pick off’ mispriced products.
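A minimal sketch of how a trading application might instrument that definition of latency, timestamping the triggering signal on arrival and the completion of the resulting order; the handler functions are illustrative stand-ins rather than any real trading or vendor API.

```python
import time

def on_market_data_event(handle_signal, send_order, await_ack):
    """Measure tick-to-trade latency: signal receipt -> order acknowledged."""
    t_signal = time.perf_counter_ns()   # triggering event observed

    order = handle_signal()             # application: interpret data, decide to trade
    send_order(order)                   # middleware/network: route the order out
    await_ack()                         # completion: venue acknowledges the order

    t_done = time.perf_counter_ns()
    latency_us = (t_done - t_signal) / 1_000
    print(f"tick-to-trade latency: {latency_us:.1f} microseconds")

# Illustrative stand-ins for the real handlers (assumptions, not a vendor API).
on_market_data_event(
    handle_signal=lambda: {"symbol": "VOD.L", "side": "BUY", "qty": 100},
    send_order=lambda order: None,
    await_ack=lambda: None,
)
```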
In 2024, equities markets globally require low-latency trading capabilities because of the widespread use of algorithms and automated trading tools by different types of market participants. Fixed income and FX markets, by contrast, are best described as ‘latency sensitive’ rather than ‘latency dependent’, owing to the extreme liquidity of some instruments or products and the tightness of bid-offer spreads in quote-driven brokerage environments. Outside of those markets, ultra-low latency is typically considered a nice-to-have by most market participants rather than an imperative for a competitive trading franchise.
Capital markets trading is now considered by all industry participants to be a global, 24/7 business. Many organisations have built complex webs of network infrastructure and large data centres to support it and, as a result, it is now commonplace for a firm or institution to leverage dedicated data centres physically co-located with various execution venues and connected over low-latency networks.
Since the advent of electronic exchange platforms in the mid-1990s, the latency thresholds under which trading technology must perform to stay competitive have been halving roughly every three years, and those thresholds are now at levels expressed in tens of milliseconds or less.
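As a rough illustration of what that halving rate compounds to, the calculation below assumes an arbitrary 30-year horizon; the starting threshold quoted in the final comment is purely illustrative.

```python
# Rough compounding illustration: a threshold that halves every three years
# shrinks by a factor of about 2**10 (roughly 1,000x) over three decades.
years = 30
halvings = years / 3
reduction_factor = 2 ** halvings
print(f"{halvings:.0f} halvings over {years} years -> ~{reduction_factor:,.0f}x reduction")
# e.g. a purely hypothetical threshold of 10 seconds in the mid-1990s would imply ~10 ms today
```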
On exchange platforms, latency is generated at three levels:
1. The applications that interpret financial information and make trading decisions, equating to approximately 65 per cent of total latency;
2. The middleware that distributes the various messages and signals, equating to approximately 25 per cent of total latency; and
3. The network fabric that, underneath it all, transports data from the market to the trading firm and back to the exchange, equating to approximately 10 per cent of total latency, as the sketch after this list illustrates.
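To make the split concrete, the sketch below allocates a hypothetical total tick-to-trade latency budget across the three levels using the approximate percentages above; the 250-microsecond total is an assumption chosen for illustration, not a benchmark figure.

```python
# Approximate split of total latency on exchange platforms, per the list above.
LATENCY_SHARE = {
    "application (decision logic)": 0.65,
    "middleware (message distribution)": 0.25,
    "network fabric (transport)": 0.10,
}

def latency_budget(total_us: float) -> dict[str, float]:
    """Allocate a total tick-to-trade budget (in microseconds) across the three levels."""
    return {level: total_us * share for level, share in LATENCY_SHARE.items()}

# Hypothetical 250-microsecond total budget, chosen purely for illustration.
for level, share_us in latency_budget(250.0).items():
    print(f"{level}: {share_us:.1f} microseconds")
```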
Market data latency can leave traders on the wrong side of the market. If the order message routed back to the broker’s server is delayed, the price at which the trader attempted to place the order may have moved away. As a consequence, the broker may not fill the order because a better price can be found elsewhere, or the broker may fill the order at a price that differs from the one shown on the trader’s screen at the moment of execution.
According to one report, the estimated value of a one-millisecond latency advantage for a brokerage firm amounts to more than USD 100 million annually. As such, the importance of reducing latency to match or improve on the latency experienced by other market participants cannot be downplayed. With more data alternatives coming to the fore, managing network latency across different platforms is becoming increasingly complex, but at the same time it is taking on greater importance as more data arrives from alternative sources.
For further information, please do not hesitate to contact us at london@greyspark.com with any questions or comments you may have. We are always happy to elaborate on the wider implications of these headlines from our unique capital markets consultative perspective.