The Joy of Seeing Double with Digital Twins

We live in strange and curious times. The unpredictability of the past year seems likely to continue for some time to come. Consequently, as planning for 2021 gets into full swing, coping with uncertainty is at the forefront of strategy in all organizations. It is no surprise, then, that investment in digital transformation is focusing increasingly on enhancing experimentation, building resilience in operational capabilities, and speeding up decision making.

In many domains, a key tool for dealing with uncertainty is the "Digital Twin". The idea has been discussed for some time, though not without a suitable amount of confusion. Isn't it just another name for "a model"? Perhaps.

Models of the real world have always been used to understand, learn, and predict. For several decades, descriptive, analytical, and executable models have been essential tools for organizations. And for at least the last 20 years, of course, these models have been created, communicated, and analyzed digitally. For much of my early career I was deeply involved in describing complex systems as digital models, using formal specification languages such as Z and graphical design notations such as the Unified Modelling Language (UML). Such models were essential in understanding and reasoning about system properties, but more importantly they often became the basis for generating executable models in software.

We must accept, however, that models are a simplification or approximation of reality. So, a critical aspect of the value of modelling is the accuracy and fidelity of the model. Does it offer a realistic representation of the real world it is intended to describe? Can the model maintain its connection to that world? Digital twins aim to address these concerns. By tying digital models to real-world data, they offer a more informed, reliable view of the system under review. In particular, with the recent wide deployment of sensors and IoT devices in homes, offices, factories, and elsewhere, a constant stream of real-time data can be used to validate the model's accuracy and help refine it, maintaining the connection between the model and the real world it represents.
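The loop described above — a model making predictions, real sensor data validating them, and the feedback refining the model — can be sketched in a few lines of Python. Everything here (the `ThermalTwin` class, its simple moving-average update rule) is an illustrative assumption for a toy example, not a standard digital-twin API:

```python
from dataclasses import dataclass, field

@dataclass
class ThermalTwin:
    """Toy digital twin of a room's temperature (illustrative only).

    The 'model' is a single predicted temperature. Each real sensor
    reading is compared against the prediction to validate the model,
    then blended back in so the twin stays tied to the physical system.
    """
    predicted_temp: float            # model's current estimate (deg C)
    learning_rate: float = 0.2       # how strongly new data corrects the model
    drift_threshold: float = 2.0     # gap (deg C) that flags a model/reality mismatch
    drift_events: list = field(default_factory=list)

    def ingest(self, sensor_temp: float) -> float:
        """Validate the model against a reading, then refine it."""
        error = sensor_temp - self.predicted_temp
        if abs(error) > self.drift_threshold:
            # Model and reality have diverged: record it for review.
            self.drift_events.append((sensor_temp, error))
        # Refine: an exponential moving average keeps the twin aligned.
        self.predicted_temp += self.learning_rate * error
        return self.predicted_temp

# Feed the twin a stream of (simulated) sensor readings.
twin = ThermalTwin(predicted_temp=20.0)
for reading in [20.5, 21.0, 24.5, 21.2]:
    twin.ingest(reading)
```

A real digital twin would of course use a far richer model (physics-based or learned) and real telemetry, but the shape of the feedback loop — predict, compare, flag drift, refine — is the same.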

Much of the use of digital twins occurs in situations where there is a need to understand or predict behaviour of complex physical systems. For example, in areas such as construction, manufacturing, and engineering, digital twins help manufacturers and engineers accomplish a great deal, including:

  • Visualizing products in use, by real users, in real-time.
  • Building a digital thread, connecting disparate systems, and promoting traceability.
  • Refining assumptions with predictive analytics.
  • Troubleshooting distant or inaccessible equipment.
  • Managing complexities and inter-relationships within multi-layered systems-of-systems.

A very good recent report from Arup provides a useful framework for understanding digital twins and their uses. Offering a historical perspective on their evolution, the report provides several examples of how the concept has been applied in manufacturing, engineering, construction, aerospace, defence, and elsewhere. Its definition of digital twin is a helpful summary of why they are important to our digital future:

A digital twin is the combination of a computational model and a real-world system, designed to monitor, control and optimise its functionality. Through data and feedback, both simulated and real, a digital twin can develop capacities for autonomy and to learn from and reason about its environment.

All of these characteristics of digital twins create opportunities to learn more about systems both in design and in use — something we will all find useful amid the unpredictability of 2021 and beyond.

Source: AWB Digital Economy Dispatch #18

Alan Brown

Alan W. Brown is Professor in Digital Economy at the University of Exeter Business School where he co-leads the Initiative in Digital Economy at Exeter (INDEX). Alan’s research is focused on agile approaches to business transformation, and the relationship between technology innovation and business innovation in today’s rapidly-evolving digital economy. After receiving a PhD in Computational Science at the University of Newcastle-upon-Tyne, Alan spent almost 2 decades in the USA in commercial high-tech companies leading R&D teams, building leading-edge solutions, and driving innovation in software product delivery. He then spent 5 years in Madrid leading enterprise strategy as European CTO for IBM’s Software group. Most recently Alan co-founded the Surrey Centre for the Digital Economy (CoDE) at the University of Surrey where he led research initiatives in 4 EPSRC-funded research projects.
