
Unlocking the Promise of Digital Twins

Feature Story


By Solmaz Barazesh Spence

Last updated June 10, 2024

[Image: Silhouetted profile of a person overlaid with digital medical and technology graphics, including circuits, data lines, and health symbols, on a blue background.]

Imagine a hospital where a doctor is reading test results from two cancer patients. One is a 62-year-old woman who sits in the exam room waiting to discuss her treatment options. The other is her virtual representation — a set of simulations and models that mimic the patient and the tumor. Together, the real-world patient, her virtual counterpart, and the flow of information between the two form a system called a digital twin.

Results from imaging or lab tests on the patient update the virtual representation, which feeds into simulations of how the patient’s body might respond to different potential therapies. Doctors could use such insights to minimize invasive testing on already weary and sick patients, and to develop personalized care plans that avoid unnecessary treatments — simultaneously making patients’ lives easier, optimizing outcomes, and reducing health care costs.

Over the past several years, advances in digital twin technologies — which use modeling and simulation to create a virtual representation that mimics the structure, context, and behavior of its physical counterpart — have brought this scenario closer to reality. Going beyond traditional simulation and modeling, digital twins feature bidirectional feedback — think of it as a continuous back-and-forth chatter — between their virtual and physical components.

The demand for using digital twins to support critical decision-making is growing across many domains. For example, a digital twin of a city’s transportation network could help reduce traffic congestion, predict the effects of adding a new bus route, and help guide future infrastructure investments. A digital twin of a coastal community could help emergency planners and citizens understand how climate change may affect storm severity, and guide resilience efforts such as building hazard mitigation infrastructure and putting disaster management strategies in place, allowing the community to recover faster.

Before these possible applications can become a reality, though, decision-makers need to know they can rely on the outputs of digital twins, particularly for high-consequence, mission-critical, and safety applications. Implementing these technologies responsibly will require an integral focus on establishing trust and credibility, according to a report released in late 2023 by the National Academies titled Foundational Research Gaps and Future Directions for Digital Twins.

“Digital twins have great promise in bringing value across areas of science and technology, including engineering, the natural world, and medicine,” said Karen Willcox, director of the Oden Institute for Computational Engineering and Sciences at the University of Texas at Austin, and chair of the committee that wrote the report. “There are serious research questions to tackle, and any responsible development of digital twin technologies must maintain an integral focus on establishing and maintaining trust.”

Assessing the Reliability of Digital Twins

The virtual representation in a digital twin is dynamically updated as new data are gathered in the real world — for example, from sensors, clinical assessments, or remote sensing. Predictions and simulations from the virtual representation in turn drive changes in the physical counterpart, whether that means adjusting the dosage of a patient’s medication or activating an additional sensor on a weather drone.

In some cases, humans make the decisions that lead to those real-world changes, such as when a doctor chooses a specific treatment plan using the information that a digital twin has provided. In other cases, the system could be fully or partially automated, such as a digital twin of an aircraft deciding to reposition a sensor on an airplane wing to improve data quality.

One concrete way to assess the reliability of digital twins’ predictions and decisions, according to the 2023 report, is through verification, validation, and uncertainty quantification (VVUQ) — a set of processes that, among other tasks, can determine the accuracy of the digital twin’s representation of the real world, and give measures of the quality of its predictions.

These processes (defined in a 2012 National Academies report) have been developed over the past several years to support the simulation and modeling of increasingly complex systems, including those that incorporate machine learning and artificial intelligence.

However, new challenges arise when it comes to VVUQ for digital twins. One issue is the dynamic updates that are a key component of digital twin technology. “We need new methods that can adapt to changes in the models, changes in the data, and changes in the prediction and decision context,” Willcox said.

The 2023 report also cautions that despite the growing use of digital twins and the simulation and modeling technologies that go into them — such as artificial intelligence, machine learning, and empirical modeling — there is no standard process for reporting the results of VVUQ, making it difficult for decision-makers to determine how confident they can be in modeling outputs. The report calls for VVUQ to be deeply embedded in digital twin technologies from design to deployment.

An upcoming National Academies symposium will examine these concepts in more detail. Assessing the Reliability of Complex, Dynamic Modeling and Simulation, scheduled for June 17, will discuss ways to measure and evaluate the reliability of complex modeling systems such as digital twins, and the contexts in which current VVUQ methods can fall short of providing the safeguards that are needed. The symposium will also explore recent advances and evolutions in VVUQ processes.

“The systems targeted by digital twins are often complex, and as a result, the models representing these systems contain numerous uncertainties,” said Omar Ghattas, Fletcher Stuckey Pratt Chair in Engineering and director of the OPTIMUS Center in the Oden Institute for Computational Engineering and Sciences at the University of Texas at Austin, and a moderator for the upcoming symposium. “Moreover, these systems are critical to the welfare of society. As such, it is essential to rigorously and systematically account for end-to-end uncertainties across the digital twin — from data assimilation and inference to model-predictive decision-making.”
