Challenges of Digital Twin Implementation
Why a digital twin data strategy is crucial to successful implementation
Hello everyone and welcome to the latest edition of GreySpark Insights.
Please do not hesitate to contact us with any questions or comments you may have. We are always happy to elaborate on the wider implications of these headlines from our unique capital markets consultative perspective. Happy reading!
Digital twin models are heavily reliant on the accuracy of their data inputs. If inaccurate or stale data is fed into the digital twin model, it will produce unreliable results, potentially leaving a financial firm's systems vulnerable to risks that are not properly addressed or mitigated. Ultimately, successful digital twin implementation depends on capturing the specific parameters that calibrate performance and design improvements, and this is impossible without high-quality, real-time data. Data preparation is often extremely time consuming, however, because the data has to be gathered from silos across different business departments and then cleansed, validated and wrangled into a usable format.
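To make the cleansing and validation step concrete, the sketch below shows one minimal way such a check might look in Python. The field names ("timestamp", "value", "source") and the five-minute freshness threshold are illustrative assumptions, not a schema or policy from GreySpark or any vendor:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical sketch: screening raw feed records before they reach a
# digital twin model, so stale or incomplete inputs are filtered out.
MAX_AGE = timedelta(minutes=5)  # assumed freshness threshold

def validate_record(record, now=None):
    """Return True if the record is complete, well-formed and fresh."""
    now = now or datetime.now(timezone.utc)
    required = ("timestamp", "value", "source")
    if any(record.get(k) is None for k in required):
        return False  # incomplete: a required field is missing
    try:
        ts = datetime.fromisoformat(record["timestamp"])
        float(record["value"])
    except (TypeError, ValueError):
        return False  # malformed timestamp or non-numeric value
    return now - ts <= MAX_AGE  # stale data would skew the twin

def cleanse(records, now=None):
    """Keep only records that pass validation."""
    return [r for r in records if validate_record(r, now)]
```

In practice this logic would sit inside a larger extract-transform-load pipeline, but the principle is the same: every record is checked for completeness, format and age before the twin ever sees it.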
Given the amount of effort needed to prepare the data, firms should consider outsourcing this process. If data is prepared in-house, a financial firm may not have the resources to process the broad and deep dataset required to create a digital twin model. Consequently, from a quality control and resourcing standpoint, it is a challenge for financial firms to implement a digital twin model by themselves. Additionally, although digital twins can help identify cybersecurity threats, they can themselves be susceptible to a security breach if not carefully overseen. Bad actors could create an almost identical yet inconspicuous copy of the digital twin and insert it into a production environment in order to inject malware into the ecosystem or steal data.
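One common defence against a swapped-in imposter model is integrity verification: recording a cryptographic digest of the genuine twin artifact at deployment time and rejecting anything that does not match. The sketch below is a minimal illustration of that idea; the registry structure and artifact names are assumptions for the example, not a specific vendor's mechanism:

```python
import hashlib
import hmac

# Hypothetical sketch: detecting a tampered or substituted digital twin
# artifact by comparing its SHA-256 digest against a trusted baseline
# recorded when the genuine model was deployed.

def digest(artifact_bytes):
    """SHA-256 digest of a model artifact's raw bytes."""
    return hashlib.sha256(artifact_bytes).hexdigest()

def verify_artifact(name, artifact_bytes, trusted_registry):
    """Return True only if the artifact matches its recorded digest."""
    expected = trusted_registry.get(name)
    if expected is None:
        return False  # unknown artifact: treat as untrusted
    # compare_digest avoids leaking information via comparison timing
    return hmac.compare_digest(digest(artifact_bytes), expected)
```

A check like this would run before any model is loaded into the production environment, so an "almost identical" copy with altered contents fails verification even if its name and appearance are indistinguishable.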
The successful implementation of a digital twin model requires firms to have already created a well-defined data strategy plan tailored to their firm’s unique specifications. The figure below depicts a model digital twin data strategy plan.
Figure: A model digital twin data strategy plan. Source: Nvidia and GreySpark analysis
Given that digital twins consist of highly sophisticated hardware and software, the development, implementation and maintenance of a digital twin is typically highly complex. During the initial phase – the digital twin data strategy – the firm may reach out to reputable, compliant third-party subject matter experts with experience of implementing digital twin models. Failure to properly execute a digital twin model can place unnecessary strain on resources and raise implementation and running costs, while leaving the firm vulnerable to operational risks. In many ways, a poorly implemented digital twin framework may do more harm than not implementing one at all.
Crucially, failure to develop a robust and agile digital twin model could mean that financial firms fail to achieve operational resilience compliance – especially as existing regulatory frameworks, such as the Digital Operational Resilience Act (DORA), continue to evolve and become more nuanced. The importance of successfully implementing a digital twin framework cannot be overstated.