The increasing digitalisation of capital markets infrastructure is opening up a variety of data channels, including data-as-a-service plug-ins, data marketplaces and alternative trading venues. The resulting data deluge is putting financial institutions' (FIs') systems and operational processes to the test. In particular, it is exacerbating the problem of data silos for some FIs, which are scrambling to consolidate their infrastructures and rethink their data management protocols in order to cope with the growing data volumes.
For example, certain types of data, such as OTC derivatives trade data, are typically processed manually or semi-manually using a patchwork of outdated technologies and processes, making the work cumbersome and prone to human error. FIs often rely on customised, makeshift systems originally intended for other functions, leading to inefficient data operations, and gaps are frequently plugged with spreadsheets, which introduces operational risk. Inevitably, these measures are only viable up to a point: once data volumes surpass a certain threshold, as with high-volume cash trade data, a more capable data processing solution is required.
Handling high-volume data also means that multiple teams within a firm need to manage and access the same data when carrying out essential tasks such as reconciliations. Due to data silos, however, this work is carried out inefficiently and repeatedly, producing duplicated effort and unnecessary cost. In fact, according to Data Dynamics, data silos cost the global economy $3.1 trillion annually. In addition, Harvard Business Review anticipates that data silos can increase business costs by as much as 80%.
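To make the reconciliation problem concrete, the sketch below compares trade records held in two separate systems and reports the breaks between them. It is a minimal illustration only; the record layout, identifiers and quantities are invented for the example, not drawn from any real system.

```python
# Minimal reconciliation sketch: compare trade records held in two
# separate systems (hypothetical data and field names).

def reconcile(system_a, system_b):
    """Return trades missing from either side, plus quantity mismatches."""
    a = {t["trade_id"]: t for t in system_a}
    b = {t["trade_id"]: t for t in system_b}
    missing_in_b = sorted(set(a) - set(b))
    missing_in_a = sorted(set(b) - set(a))
    breaks = sorted(tid for tid in set(a) & set(b)
                    if a[tid]["quantity"] != b[tid]["quantity"])
    return {"missing_in_b": missing_in_b,
            "missing_in_a": missing_in_a,
            "quantity_breaks": breaks}

front_office = [
    {"trade_id": "T1", "quantity": 100},
    {"trade_id": "T2", "quantity": 250},
    {"trade_id": "T3", "quantity": 75},
]
back_office = [
    {"trade_id": "T1", "quantity": 100},
    {"trade_id": "T2", "quantity": 200},  # quantity break
    {"trade_id": "T4", "quantity": 50},   # unknown to front office
]

result = reconcile(front_office, back_office)
print(result)
# -> {'missing_in_b': ['T3'], 'missing_in_a': ['T4'], 'quantity_breaks': ['T2']}
```

When the same logic is re-implemented independently by every team that touches the data, the duplicated effort described above follows almost automatically; a shared, centralised version of this step is one of the gains of consolidation.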
Beyond reconciliations, data silos also complicate data preparation. Preparing data for activities such as scenario testing and meeting regulatory requirements is often extremely time consuming, because the data has to be gathered from silos across different business departments and then cleansed, validated and wrangled into a usable format. Given the effort involved, firms should consider outsourcing this process; if data is prepared in-house, the firm may not have the resources to process the datasets.
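The cleanse-validate-wrangle step described above can be sketched as follows. This is a toy example under stated assumptions: the field names, validation rules and sample records are all hypothetical, standing in for whatever a firm's actual data quality rules would be.

```python
# Minimal data-preparation sketch: cleanse and validate raw trade
# records pulled from different silos (all field names hypothetical).

def prepare(records):
    """Normalise field names and values, keep valid rows, report rejects."""
    clean, rejects = [], []
    for rec in records:
        # Normalise keys: strip stray whitespace, lower-case.
        row = {k.strip().lower(): v for k, v in rec.items()}
        # Basic validation: a usable row needs an identifier and a
        # parseable, positive notional.
        try:
            row["notional"] = float(row["notional"])
        except (KeyError, TypeError, ValueError):
            rejects.append(rec)
            continue
        if not row.get("isin") or row["notional"] <= 0:
            rejects.append(rec)
            continue
        row["isin"] = str(row["isin"]).upper()
        clean.append(row)
    return clean, rejects

raw = [
    {"ISIN ": "us0378331005", "Notional": "1000000"},
    {"isin": "", "notional": "500"},              # missing identifier
    {"isin": "DE0005557508", "notional": "n/a"},  # unparseable amount
]
clean, rejects = prepare(raw)
print(len(clean), len(rejects))  # -> 1 2
```

Even this trivial version shows why the work is slow at scale: every silo tends to have its own field names, formats and quality quirks, so the normalisation rules multiply with each new source.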
To address the data issues outlined above, firms should consider reducing the number of platforms and processes they use in their operations. The most obvious way to do this is to consolidate operations into a cloud-based vendor solution capable of handling significant data volumes. Doing so takes much of the pressure off support teams and allows them to focus on core business tasks. Of course, the drawbacks will need to be considered, chief among them limited control and flexibility over data processing, and vendor lock-in.
In addition, a typical traditional on-premises server room can strain under the weight of surging trade volumes and large, complex data sets. This overload can cause legacy systems to buckle, resulting in downtime, missed trading opportunities and disastrous reputational consequences. Maintaining failing physical infrastructure is also costly and cumbersome.
As GreySpark observes, cloud migration offers trading firms scalability, flexibility and cost-effectiveness, though the cost issue is now considered to be less important than other needs such as high-performance computing, low-latency networks or advanced security features. Benefits that can be realised in a cloud environment include:
Dynamic Resource Allocation: Cloud solutions enable trading firms to adjust their IT resource usage in real time based on their needs. Depending on market conditions, they can quickly scale up or down without investing in expensive on-premises infrastructure.
Hybrid and Multi-Cloud Strategies: By using multiple cloud providers or a mix of cloud and on-premises solutions, trading firms can improve redundancy and ensure uninterrupted operations. This is particularly important as downtime can result in significant financial losses for trading firms.
Containerisation: Container technologies, like Docker, allow trading firms to package applications and their dependencies into standardised units, making them easier to deploy and manage across different environments.
Microservices Architecture: This approach involves breaking down complex applications into smaller, independent services that can be deployed, scaled and updated individually. This enables trading firms to develop, test and release new features more rapidly, providing them with a competitive edge.
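As a toy illustration of the dynamic resource allocation point above, the function below derives a target instance count from observed message throughput, the kind of rule a cloud autoscaling policy encodes. The capacity figure, thresholds and names are invented for the example, not taken from any provider's API.

```python
# Toy autoscaling sketch (hypothetical thresholds): choose a target
# instance count from observed messages-per-second throughput.

def target_instances(msgs_per_sec, capacity_per_instance=10_000,
                     min_instances=2, max_instances=40):
    """Scale out with load, but stay within a fixed band."""
    needed = -(-msgs_per_sec // capacity_per_instance)  # ceiling division
    return max(min_instances, min(max_instances, needed))

print(target_instances(5_000))    # quiet market: floor applies -> 2
print(target_instances(95_000))   # busy market: scale out -> 10
```

In practice a cloud provider's autoscaler applies rules of this shape automatically; the point of the sketch is simply that capacity tracks load, rather than being fixed at the peak as with on-premises hardware.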
Crucially, migrating data processing systems to the cloud can go a long way towards fixing a firm's inadequate and piecemeal data processing, allowing capital market participants to deal with higher data volumes.