
Opportunities and Challenges for Digital Twins in Engineering

Proceedings of a Workshop—in Brief


The digital twin is an emerging technology that builds on the convergence of computer science, mathematics, engineering, and the life sciences. Digital twins have potential across engineering domains, from aeronautics to renewable energy. On February 7 and 9, 2023, the National Academies of Sciences, Engineering, and Medicine hosted a public, virtual workshop to discuss characterizations of digital twins within the context of engineering and to identify methods for their development and use. Panelists highlighted key technical challenges and opportunities across use cases, as well as areas ripe for research and development (R&D) and investment. The third in a three-part series, this evidence-gathering workshop will inform a National Academies consensus study on research gaps and future directions to advance the mathematical, statistical, and computational foundations of digital twins in applications across science, medicine, engineering, and society.1

PLENARY SESSION 1: DIGITAL TWINS IN STRUCTURAL ENGINEERING

Charles Farrar, Los Alamos National Laboratory (LANL), explained that many digital twins are computer-based digital models of physical systems that interface with data. A ban on system-level nuclear testing2 as well as increased investments in high-performance computing hardware, code development, and verification and validation methods and experiments enabled initial advances in “digital twin technology” at LANL beginning in 1992. He emphasized that a digital twin is shaped by questions; as those questions evolve, the digital twin evolves to incorporate more detailed physical phenomena, geometry, and data and to account for more sources of uncertainty.

Farrar underscored that validation data are often difficult and costly to obtain, and replicating actual loading environments is challenging. All real-world structures have variable properties, and incorporating this variability into modeling is particularly difficult. Most structural models are deterministic, but their inputs are often probabilistic. Therefore, he said, uncertainty could be incorporated by varying model parameters based on known or assumed probability distributions.
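
In practice, Farrar’s suggestion is often implemented as Monte Carlo propagation: the deterministic model is evaluated many times with parameters drawn from the assumed distributions, and the spread of the outputs characterizes the prediction uncertainty. The sketch below illustrates the idea on a hypothetical cantilever-deflection model; the beam dimensions, load, and parameter distributions are illustrative assumptions, not values discussed at the workshop.

```python
# Minimal sketch: propagating parameter uncertainty through a deterministic
# structural model by Monte Carlo sampling (illustrative values only).
import numpy as np

rng = np.random.default_rng(0)

def tip_deflection(E, P, L=2.0, I=8.0e-6):
    """Deterministic model: cantilever tip deflection delta = P*L^3 / (3*E*I)."""
    return P * L**3 / (3.0 * E * I)

n = 10_000
E = rng.lognormal(mean=np.log(200e9), sigma=0.05, size=n)   # Young's modulus [Pa], assumed spread
P = rng.normal(loc=10e3, scale=1e3, size=n)                 # applied load [N], assumed spread

delta = tip_deflection(E, P)
print(f"mean deflection = {delta.mean()*1e3:.2f} mm, "
      f"95% interval = [{np.percentile(delta, 2.5)*1e3:.2f}, "
      f"{np.percentile(delta, 97.5)*1e3:.2f}] mm")
```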

Farrar indicated that digital twins could include physics-based (e.g., finite element), data-driven (e.g., machine learning [ML] and artificial intelligence [AI]), statistical, or hybrid (e.g., physics-constrained ML) models.

__________________

1 To learn more about the study and to watch videos of the workshop presentations, see https://www.nationalacademies.org/our-work/foundational-research-gaps-and-future-directions-for-digital-twins, accessed February 23, 2023.

2 In September 1992, the Senate passed the Hatfield-Exon-Mitchell Amendment, a 9-month moratorium on nuclear testing that preceded the Comprehensive Nuclear-Test-Ban Treaty of 1996.


Structural models are often developed based on nominal geometry and material properties; obtaining data that enable modeling of residual stresses, initial flaws and imperfections, thermal distributions, geometric variability, and details of joints and interfaces is difficult. He remarked on the need to consider time and length scales as well.

Farrar stressed that understanding the limitations of digital twins is critical to success. All models are limited by assumptions associated with the physics being modeled, the training data, the validation data, the knowledge of the physical system, and the knowledge of the inputs to the physical system. These limitations define the domain in which one has confidence in the “answer” provided by the digital twin, although that confidence will not necessarily be uniform across the domain. He described several research gaps and areas for investment, including the following: quantifying the level of model fidelity sufficient to answer questions asked of the digital twin, quantifying the physical system’s initial or current conditions and incorporating that information into the digital twin, obtaining data for model validation and uncertainty quantification, developing new approaches to human–computer interfaces, and enhancing education with an interdisciplinary class that focuses on digital twins and emphasizes verification and validation.

Derek Bingham, Simon Fraser University, asked Farrar about the difference between complex multiphysics simulations and digital twins. Farrar explained that digital twins have a tighter integration between the data and the model than simulations.

Bingham also asked about the advantages and disadvantages of data-driven and physics-based models as well as how to better integrate them into digital twins to support decision-making. Farrar replied that modeling operational and environmental changes to systems is difficult; however, data could be acquired with in situ monitoring systems. For physical phenomena that lack first principles models but have ample data, he suggested leveraging data-driven approaches either exclusively or to augment the physical model.
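
Farrar’s augmentation idea is commonly realized by fitting a data-driven correction to the residuals between the physics-based prediction and the monitoring data. The following sketch is a minimal illustration of that pattern; the linear "physics" model, the synthetic monitoring data, and the choice of a random forest for the correction are all hypothetical assumptions, not details from the workshop.

```python
# Minimal sketch of a hybrid model: physics-based prediction plus a
# data-driven correction fit to the residuals (illustrative data only).
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)

def physics_model(load):
    """Simplified first-principles prediction (assumed linear response)."""
    return 0.8 * load

# Synthetic "in situ monitoring" data with an unmodeled nonlinear effect.
load = rng.uniform(0.0, 10.0, size=500)
measured = 0.8 * load + 0.05 * load**2 + rng.normal(0.0, 0.1, size=load.size)

residual = measured - physics_model(load)
correction = RandomForestRegressor(n_estimators=100, random_state=0)
correction.fit(load.reshape(-1, 1), residual)

def hybrid_model(load_new):
    load_new = np.atleast_1d(load_new)
    return physics_model(load_new) + correction.predict(load_new.reshape(-1, 1))

print("physics only at load=9:", physics_model(9.0))
print("hybrid at load=9:      ", hybrid_model(9.0)[0])
```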

PANEL 1: DIGITAL TWIN USE CASES ACROSS INDUSTRIES

Workshop participants heard brief presentations from and discussions among five panelists, each of whom addressed current methods and practices, key technical challenges and opportunities, and R&D and investment needs related to digital twin use cases across their respective industries. Elizabeth Baron, Unity Technologies, presented perspectives from the automotive industry; Karthik Duraisamy, University of Michigan, focused on computational science and fluid dynamics applications; Michael Grieves, Digital Twin Institute, described the use of digital twins in manufacturing settings; Michael Gahn, Rolls-Royce, discussed aircraft engine design and model-based systems engineering; and Dinakar Deshmukh, General Electric, offered perspectives from the aviation industry.

Current Methods and Practices

Baron noted that the automotive industry has been adapting digital twin technologies since the 1990s and emphasized that increased adoption of real-time digital twins could accelerate Industry 4.0 (also referred to as the Fourth Industrial Revolution, or 4IR) and improve customer-oriented manufacturing, design, and engineering. She defined a digital twin as a dynamic virtual copy of a physical asset, process, system, or environment that behaves identically to its real-world counterpart. It ingests data and replicates processes to predict possible real-world performance outcomes. Processes, tools, and culture are affected, and people play a critical role in testing usability and function in digital twin design. She indicated that digital twins also provide an effective means of communication to account for how people understand and solve problems.

Duraisamy highlighted efforts to train offline dynamic system models to work online—more closely to real time—to attribute causes for events or anomalies; research is under way to create models that both run and make inferences faster. Although challenges related to identifiability, likelihoods and priors, and model errors remain, he noted that many digital twin applications could leverage simple models, algorithms, and decision processes to improve productivity. He described six “classes” of digital twins: Level 1 digital twins provide information to users; Level 2 digital twins assist operators with decision support; Level 3 digital twins empower managers for high-value decision-making with confidence; Level 4 digital twins assist organizations in planning and decision-making; Level 5 digital twins empower organizations to better communicate, plan, and consume knowledge; and Level 6 digital twins define organizations’ decisions and create knowledge.


Grieves observed that digital twins typically have three components: (1) the physical space of the product and the environment; (2) the virtual space of the product and the environment; and (3) the connection between the physical and virtual spaces where data from the physical space populate the virtual space, and information developed there is brought back to the physical space.

Gahn described Rolls-Royce’s digital engineering framework, where data flow from physical assets to digital models to continually update them. By mirroring physical engineering processes in a digital realm, customers could improve forecasting, reduce life-cycle cost, increase asset availability, and optimize performance. How a company leverages digital twins depends on the desired outcome—for example, eliminating an engineering activity could enable faster, cheaper progress. He emphasized that cybersecurity is a key consideration for digital twins that directly inform decision-making, and a digital twin of the cyberphysical system is needed to protect against adversarial attacks.

Deshmukh provided an overview of commercial digital twin applications for fleet management. When building a representation of an asset for a digital twin, capturing important sources of manufacturing, operational, and environmental variation is key to understanding how a particular component is behaving in the field. Sophisticated representations that capture these sources of variation component by component and asset by asset illuminate how operations could be differentiated. He underscored that reducing disruption, such as unscheduled engine removal or maintenance for major air carriers, is critical in the aviation industry.

Incorporating questions from workshop participants, Parviz Moin, Stanford University, moderated a discussion among the panelists. In response to a question about the security of centralized digital twins, Gahn replied that it depends both on the asset and the types of data. For example, reliability data are usually collated for a digital twin in a central location, but this creates a target for an adversary. Other applications might leverage the edge instead, which offers more security. He explained that securing the digital twin environment is a significant challenge, with data from the asset, data in transit, data stored between those two points, and data for modeling—if the systems are not secure, one cannot develop confidence in those data or the resulting decisions. Baron added that another level of data security to communicate between multifunctional teams would be useful. Moin wondered how proprietary information is handled and whether digital twins should be open source. Duraisamy noted that open sourcing could be beneficial but difficult for some stakeholders, and standards would be useful. Gahn emphasized the value of reference models, and Baron and Duraisamy suggested that a framework could be open source while instances of the data could be highly protected.

Moin inquired about the current state of the art across digital twins. Duraisamy explained that Level 1 digital twins are successful, and applications for Levels 2 and 3 digital twins exist in industry for simple decision-making where systems can be modeled to a high degree of confidence. Duraisamy added that because products are often designed to operate in well-understood regions, parameterization with real-world data could help digital twins become more useful in real-world settings. However, models often are not configured to handle rare events, and geometry changes might not be considered. If the goal is to use digital twins for difficult problems and difficult decisions, he continued, understanding model form error and developing strategies to quantify that error quickly are critical. Deshmukh noted that digital twins should learn continuously from field observations, and Grieves agreed that models will improve with more real-world data for validation.

Moin asked how to quantify the relationship between the amount of data collected and the confidence level in a digital twin’s prediction, particularly for low-probability, high-risk events.


Duraisamy said that all sources of uncertainty have to be factored in, and understanding whether the algorithms have the capabilities to make a decision within the right time frame is also key. Deshmukh added that ground-truth reality is critical to build digital twins and that models improve with more data, but a framework is needed to update the models continually. Grieves suggested developing a risk matrix to help understand the probability and potential impacts of particular events and to help determine which and how much data are needed. Moin inquired about the norms across use cases that allow one to standardize the quality of a digital twin. Grieves explained that the use case determines whether an online connection is needed for real-time data collection or if data should be collected over months or years.
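
Grieves’s risk matrix can be made concrete as a simple lookup that scores events by likelihood and consequence and maps the score to a data-collection priority. A minimal sketch follows; the scales, thresholds, and recommended actions are illustrative assumptions rather than anything proposed at the workshop.

```python
# Minimal sketch of a risk matrix used to prioritize data collection
# (likelihood/impact scales and thresholds are illustrative assumptions).
LIKELIHOOD = {"rare": 1, "unlikely": 2, "possible": 3, "likely": 4, "frequent": 5}
IMPACT = {"negligible": 1, "minor": 2, "moderate": 3, "major": 4, "catastrophic": 5}

def risk_priority(likelihood: str, impact: str) -> str:
    score = LIKELIHOOD[likelihood] * IMPACT[impact]
    # High-consequence events are escalated regardless of likelihood, so that
    # low-probability, high-risk events still trigger data collection.
    if IMPACT[impact] >= 4 or score >= 15:
        return "high: instrument continuously, collect real-time data"
    if score >= 6:
        return "medium: periodic inspection and targeted sensing"
    return "low: rely on existing records"

print(risk_priority("rare", "catastrophic"))   # low-probability, high-risk event
print(risk_priority("likely", "minor"))
```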

Moin posed a question about the strength of coordination across component models in terms of model interoperability, data flow, matching data, and frequency of data update needs. Grieves and Baron indicated that coordination is poor. Baron elaborated that many of the data come from multiple sources, have different formats, and have different levels of tolerance in terms of validation and verification criteria. She described this as a difficult but critical problem to solve for digital twins: unifying the data and presenting them in context would allow for consistency throughout the evaluation.

Technical Challenges and Opportunities

Baron explained that an effective digital twin illuminates where systems relate to one another. Providing contextual relationships between these vertical functions4 (Figure 1) is both a challenge and an opportunity, as every function adds an important level of completeness to the digital twin. Holistic digital twins could build up and break down experiences, using AI to provide information and incorporating human collaboration when insights reveal problems that should be addressed.

Duraisamy observed that discussions about implementing digital twins tend to focus on the sensors, the data, the models, the human–computer interfaces, and the actions. However, he said that to move from models to actions, the following “enablers” should be considered: uncertainty propagation, fast inference, model error quantification, identifiability, causality, optimization and control, surrogates and reduced-order models, and multifidelity information.
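
Several of these enablers (fast inference, uncertainty propagation, and surrogates or reduced-order models) are often combined by replacing an expensive simulation with a cheap emulator that can then be sampled heavily. The sketch below trains a Gaussian process surrogate on a handful of runs of a stand-in "expensive" function; the function, design size, and input distribution are assumptions for illustration.

```python
# Minimal sketch: train a cheap surrogate on a few expensive simulation runs,
# then use it for fast Monte Carlo uncertainty propagation (illustrative only).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(2)

def expensive_simulation(x):
    """Stand-in for a costly multiphysics solve."""
    return np.sin(3.0 * x) + 0.5 * x

# A small design of experiments: only 12 "expensive" runs.
x_train = np.linspace(0.0, 2.0, 12).reshape(-1, 1)
y_train = expensive_simulation(x_train).ravel()

surrogate = GaussianProcessRegressor(kernel=RBF(length_scale=0.5),
                                     alpha=1e-6, normalize_y=True)
surrogate.fit(x_train, y_train)

# Fast propagation: thousands of surrogate evaluations of an uncertain input.
x_samples = rng.normal(loc=1.0, scale=0.2, size=5000).clip(0.0, 2.0).reshape(-1, 1)
y_samples = surrogate.predict(x_samples)
print(f"surrogate-based mean = {y_samples.mean():.3f}, std = {y_samples.std():.3f}")
```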

Grieves remarked that manufacturing operations is rich with opportunities for digital twins, where the goal is to use the fewest resources to predict and address future problems. For example, digital twins could improve quality control by enabling virtual testing of a product and help inform a supply network that exists alongside a supply chain. He stressed that interoperability and cybersecurity are essential to prevent threats to the safety of the manufacturing floor.

Gahn elaborated on Rolls-Royce’s digital framework, which contains a market twin, program twin, product twin, component twin, production twin, and digital twin. Even within a single organization, each twin requires data from different business units; crossing multiple organizations is even more difficult, given the need to protect intellectual property. Issues might also arise in regulation and certification, and he pointed out that in addition to technical challenges, digital twins have to overcome programmatic, commercial, and legal barriers to ensure the best outcome for the customer.

Deshmukh emphasized that the digital twin “invention to production journey is a team sport.” Challenges include defining the scope (i.e., type of problem and model transparency); migrating the analytics into production (i.e., the outcome determines the use of the edge versus the cloud, and the digital twin has to be capable of identifying analytic degradation and uncertainty with time); and considering data availability in production, a scalable data platform, and the right team balance (i.e., data scientists, software engineers, subject-matter experts, and business owners). Deshmukh remarked that opportunities for digital twins include enhanced asset reliability, planned maintenance, reduced maintenance and inspection burden, and improved efficiency.

__________________

4 Vertical functions are tasks with performance attributes in systems engineering and design fundamentals with specific requirements that must be satisfied to provide the desired operational capabilities. Each vertical function must also perform with respect to other defined vertical functions for the system to properly work as a whole. (Definition provided by Elizabeth Baron via email on June 7, 2023.)

FIGURE 1 Cross-functional collaboration enabled by digital twins. SOURCE: Elizabeth Baron, Unity Technologies, presentation to the workshop, February 7, 2023.

He stressed that these outcomes emerge by combining automated, continuous, and curated data; expertise from physics and data science; and capable, scalable, and configurable AI and ML.

Incorporating questions from workshop participants, Carolina Cruz-Neira, University of Central Florida, moderated a discussion among the panelists. She asked about regulatory barriers to digital twin adoption. Grieves explained that risk-averse regulatory agencies need to be educated on the value of digital twins, which could be achieved in part by demonstration. The automotive industry has embraced virtual crash testing because many more types of testing can be done at a much lower cost. Although a physical validation will likely always be required, he asserted that better information emerges via virtual testing (assuming the physics are correct). The most significant barrier to progress in this area, he continued, is the established culture of the test community.

Cruz-Neira posed a question about challenges related to the interoperability of and the mathematical foundations for digital twins. Duraisamy commented that when building a digital twin of a product at different stages of its life cycle, quantities of interest vary. However, the level of information needed and how to transfer that information across parts of the life cycle is not well understood. Mathematical models that provide information beyond these quantities of interest are useful for many applications, he said, including for interoperability. Cruz-Neira inquired about the challenges of sustaining a digital twin over a product’s life cycle. Deshmukh said that models might not maintain their level of confidence when put into production; “guardrails” to constantly assess the outcome of the digital twin with respect to the ground-truth verification would be useful. Grieves added that digital twins should be “learning models” that capture information about the degradation of old products to better inform and maintain new products.

In response to a question from Cruz-Neira about the use of digital twins for training, Grieves described this application as a significant opportunity to allow for mistakes to be made in the virtual environment. Gahn said that his team has been pursuing the use of digital twins for training with augmented reality sessions for engine overhaul and maintenance. He added that the ability to arrange the manufacturing floor virtually for equipment placement is another opportunity.


Cruz-Neira asked about nontechnical (i.e., cultural, financial, and managerial) barriers to leveraging the full potential of digital twins. Baron explained that every vertical function (e.g., design, engineering, and manufacturing) has its own culture; if issues emerge where these functions converge, questions arise about responsibility. However, she said that high-level leaders should enable employees to work collectively to find affordable solutions by relying on data for contextual insight. Grieves pointed out that decision-makers are not digital natives, and digital twins “look like magic” to them. He advised educating, training, and building trust among decision-makers, ensuring that they understand the potential of the technology and will invest when appropriate. Duraisamy said that no single algorithm exists to solve all problems, and decision-makers should understand that progress happens gradually by assimilating knowledge and incorporating rigor. Deshmukh suggested connecting business outcomes to technology innovations and encouraging leadership to invest incrementally. Gahn agreed that defining a tangible benefit related to specific stakeholders in a language they understand is critical.

Research and Development and Investment

Baron presented a hierarchy of digital twins that described their potential capabilities. The “virtual twin” is a physically accurate, realistic digital representation of an asset, facility, or product that emulates its real-world counterpart. The “connected twin” integrates real-time and right-time data to provide insights into the performance of an asset at specific points in time, which requires significant human-in-the-loop interaction. The “predictive twin” leverages data to predict outcomes for the operations of complex facilities and equipment. The “prescriptive twin” leverages advanced modeling and real-time simulation for potential future scenarios as well as prescriptive analytics. The “autonomous twin,” the “nirvana of digital twins,” would use multiple real-time data streams to learn and make decisions to correct issues automatically and enable predictive and prescriptive analytics. She explained that the success of any digital twin relies on people collaborating. The pathway to this success also leverages increased computing capability to solve problems faster, as well as lessons from the past to better predict and address issues in the future.

Duraisamy stated that research on deriving efficient physics-constrained models, as well as those that assimilate information, is needed; ideally, these models would be probabilistic. Efficient and accurate models and algorithms are key to realizing the full potential of digital twins, he added. Investment in methods and algorithms for scalable inference and uncertainty quantification, identifiability, causality, and physics-constrained modeling is also critical—quantifying and effectively managing model form errors and uncertainties remains problematic. He encouraged the community to focus on open standards, common terminology, verification, and validation and championed foundational mathematics to advance digital twins.

Grieves highlighted the need for further research in ontologies and harmonization among groups; interoperability (from cells, to units, to systems, to systems of systems); causality, correlation, and uncertainty quantification; data–physics fusion; and strategies to change the testing and organizational culture. He pointed out that U.S. digital twin research is currently limited as compared to work in Europe and China, and he proposed that digital twins be introduced to students as part of a revised postsecondary curriculum.

Deshmukh explained that data are at the core of digital twin success, and digital twin adoption is the critical end point. He asserted that a scalable platform for structured and unstructured data and the ability to compute at scale are needed. Integrating data science and domain knowledge is critical to enable decision-making based on analytics to drive process change, he said, and tomorrow’s “disruptors” will manage massive amounts of data and apply advanced analytics with a new level of intelligent decision-making.

Incorporating questions from workshop participants, Conrad Tucker, Carnegie Mellon University, moderated a discussion among the panelists. He asked them to elaborate on the significance of “interoperability” for digital twins.


Duraisamy defined “interoperability” as managing “information” so that it travels seamlessly across the full chain. Baron explained that nothing is developed in a silo: many data have to interoperate in a digital twin for training, and many groups have to work together on a digital twin for problem solving. Grieves noted that machines on the factory floor have an interoperability problem that could be addressed with platforms that “translate” one machine to another. Although standards might not be needed, he continued, harmonization (especially with smaller manufacturers) is essential.

In response to a question from Tucker, Gahn noted that intrusion detection is only one component of protecting digital twins. Many frameworks are available, but a baseline level of protection prepares users with a plan to recover data and modeling tools if an attack occurs. He suggested that users consider protection both in terms of the cloud and the physical asset that moves around globally. A secure download from the asset to the Internet is needed to capture data, he continued, but questions arise about encryption and managing encryption keys.

Tucker asked the panelists how they would convince skeptics to invest sustainably in digital twins. Deshmukh advised focusing on the business problem that could be solved with digital twins: organizations that adopt the technology could offer better experiences for their customers. Gahn added that, to temper expectations, stakeholders should understand what the digital twin will not do (e.g., predict something that is not in the model). Grieves observed the difficulty of “selling” future opportunities; instead of presenting technical pitches to stakeholders, he suggested discussing potential financial performance and cost reduction in the near term. Baron proposed presenting information in context—volumetrically, visually, functionally, and personally—within a globally connected team. Duraisamy suggested highlighting the “digital threads” that could make products more versatile.

Tucker posed a question about opportunities for engineering education, and Grieves encouraged an integrative approach: curricula should include a focus on real-world problems, which would increase student understanding, motivation, and retention. Duraisamy pointed out that although the current curriculum, shaped in the 1960s, has value, it lacks several components. Students have become better system-level thinkers, which is an important skill to accompany mathematics and physics knowledge. He emphasized that concepts should be connected across the entire degree program. Gahn explained that Rolls-Royce values “T-shaped” engineers with breadth across disciplines. He suggested new strategies to introduce digital twin concepts, including microtraining or microcertifications. Deshmukh commented that digital upskilling for the current workforce is essential.

PLENARY SESSION 2: DIGITAL TWINS FOR NATIONAL SECURITY AND RENEWABLE ENERGY

Grace Bochenek, University of Central Florida, described digital twins as “innovation enablers” that are redefining engineering processes and multiplying capabilities to drive innovation across industries, businesses, and governments. She asserted that this level of innovation is facilitated by a digital twin’s ability to integrate a product’s entire life cycle with performance data and to employ a continuous loop of optimization. Ultimately, digital twins could reduce risk, accelerate time from design to production, and improve decision-making as well as connect real-time data with virtual representations for remote monitoring, predictive capabilities, collaboration among stakeholders, and multiple training opportunities.

Bochenek noted that opportunities exist in the national security arena to test, design, and prototype processes and exercise virtual prototypes in military campaigns or with geopolitical analysis to improve mission readiness. Digital twins could increase the speed of delivery for performance advantage, and digital twin technologies could help the United States keep the required pace of innovation by assessing systems against evolving threats and finding solutions. She pointed out that questions remain about how to connect digital twins across an organization to build stakeholder trust and confidence in decisions and investments.


Bochenek presented another opportunity to leverage digital twins in the energy sector, which aims to modernize the electric grid with renewable energy against a backdrop of quickly changing regulatory requirements and complex energy systems. She said that the ability to accelerate technology development, to optimize operations, and to use analysis to innovate advanced energy systems at scale is critical. Digital twins could enhance understanding of physical grid assets; help utility companies improve planning; expand training for personnel; and improve the cycle of learning, designing, and testing.

Bochenek summarized that digital twins are decision-making tools that help determine how to use finite resources more efficiently to drive sustainability, develop better ideas, and initiate more productive partnerships with stakeholders. Scale and fidelity are significant digital twin challenges, as is the availability of open, real-time, high-quality data. Data ownership, data management, data interoperability, intellectual property rights, and cybersecurity are also key considerations. She underscored that because energy systems are large and complex, grid modernization demands advanced modeling capabilities and real-time interaction with data for predictive and forensic analysis and energy resource protection.

Incorporating questions from workshop participants, Moin moderated a discussion with Bochenek. He posed a question about the most compelling use cases for digital twins. Bochenek described opportunities to use digital twins to support planning, prediction, and protection for smart cities. Other opportunities exist in homeland security and transportation, but each requires involvement from both local and state governments. Space operations is another exciting application for digital twins, she added.

Moin asked about the progress of implementing digital twin concepts in the postsecondary curriculum. Bochenek responded that developing a workforce that is proficient in and can innovate with digital twin technologies is key to drive the future forward. Because digital twins are interdisciplinary by nature, she stressed that education should be interdisciplinary. Additional thought could be given to the role that humans play in digital twins as well as how to ensure that foundational science informs engineering applications. She also advocated for increased certificate programs and professional education for the current workforce.

PANEL 2: DIGITAL TWIN USE CASES ACROSS INDUSTRIES

Workshop participants heard brief presentations from and discussions among four panelists, each of whom addressed current methods and practices, key technical challenges and opportunities, and R&D and investment needs related to digital twin use cases across their respective industries. José Celaya, SLB (previously Schlumberger), discussed modeling challenges for digital twin value creation in the oil and gas industry; Pamela Kobryn, Air Force Research Laboratory (AFRL), shared her views on aircraft sustainment; Devin Francom, LANL, offered perspectives on stockpile stewardship; and Devin Harris, University of Virginia, discussed large-scale infrastructure systems.

Current Methods and Practices

Celaya pointed out that using the term “digital twin” to describe ongoing computational modeling would not unlock new value or performance. New attributes are critical, and a digital representation could focus on a different perspective of an asset or machine such as reliability. He described digital twins as “living systems” and explained that the large critical assets and the asset fleets that comprise the oil and gas industry require different types of digital twins for optimization. He suggested increased partnership with technology providers to advance data management, use of the cloud, and modeling capabilities for digital twins, which could improve internal productivity and efficiency and enhance products for customers.

Kobryn provided an overview of the AFRL Airframe digital twin program, which focused on better maintaining the structural integrity of military aircraft. The initial goal of the program was to use digital twins to balance the need to avoid the unacceptable risk of catastrophic failure with the need to reduce the amount of downtime for maintenance and prevent complicated and expensive repairs. The program focused on how operators use the fleet—for example, anticipating how they would fly the aircraft and mining past usage data to update the forecasts for each aircraft.


Kobryn noted that understanding the “state” of the aircraft was also critical—for instance, any deviations from the manufacturing blueprint, as well as inspections and repairs, would be tracked over the aircraft’s lifetime. She explained that all of these data informed simulations to provide timely and actionable information to operators about what maintenance to perform and when. Operators could then plan for downtime, and maintainers could prepare to execute maintenance packages tailored for each physical twin and use them for updates. Moving forward, connecting the simulations across length scales and physical phenomena is key, as is integrating probabilistic analysis. Although much progress has been made, she noted that significant gaps remain before the Airframe digital twin can be adopted by the Department of Defense.

Francom explained that science-based stockpile stewardship emerged after the Comprehensive Nuclear-Test-Ban Treaty of 1996.5 The Department of Energy (DOE) invested in simulation and experimental capabilities to achieve this new level of scientific understanding, leveraging available data and simulation capacity to ensure the safety and functionality of the stockpile. He noted that uncertainty quantification, which is critical in this approach but difficult owing to the different categories of physics and engineering involved, could be better understood with the use of digital families. For example, digital twins have the ability to leverage global and local data to refine the understanding of relevant physics and enhance the accuracy of predictive simulations. He indicated that many tools could be useful in collaboration with digital twins—for instance, design of experiments, surrogate modeling, sensitivity analysis, dimension reduction, calibration and inversion, multifidelity uncertainty quantification, and prediction and decision theory—but gaps in the state of the art for reasoning about uncertainty could be pursued further.
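
Calibration and inversion, one of the tools Francom listed, can be illustrated with a minimal Bayesian update: a simulation parameter is given a prior, and noisy observations update it to a posterior evaluated on a grid. The "simulator," data, and prior below are hypothetical stand-ins with no connection to stockpile quantities.

```python
# Minimal sketch of Bayesian calibration/inversion on a grid: update a
# simulation parameter from noisy observations (all values illustrative).
import numpy as np

rng = np.random.default_rng(3)

def simulate(theta, t):
    """Stand-in simulator: exponential decay with unknown rate theta."""
    return np.exp(-theta * t)

t_obs = np.linspace(0.0, 5.0, 20)
y_obs = simulate(0.7, t_obs) + rng.normal(0.0, 0.05, size=t_obs.size)  # synthetic data
sigma = 0.05                                                            # assumed noise level

theta_grid = np.linspace(0.01, 2.0, 400)
log_prior = -0.5 * ((theta_grid - 1.0) / 0.5) ** 2          # Gaussian prior: mean 1.0, sd 0.5
log_like = np.array([
    -0.5 * np.sum((y_obs - simulate(th, t_obs)) ** 2) / sigma**2 for th in theta_grid
])
log_post = log_prior + log_like
weights = np.exp(log_post - log_post.max())
weights /= weights.sum()                                    # discrete posterior on the grid

post_mean = np.sum(theta_grid * weights)
post_sd = np.sqrt(np.sum((theta_grid - post_mean) ** 2 * weights))
print(f"posterior rate = {post_mean:.3f} +/- {post_sd:.3f} (data generated with rate 0.7)")
```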

Harris remarked that smart cities are the model for sociotechnical systems of the future. He defined the use of digital twins for smart cities as virtual representations of engineered systems that allow engineers and operators to understand how a system performs over time and under future scenarios—that is, a decision-making tool with a real-time, cyber–human collaboration framework. He discussed the potential use of digital twins to monitor bridge structures, for which timescales are particularly challenging. In the past, sensing tools were installed to find damage over time, but it was difficult to understand localized damage. Structural health monitoring then shifted from purely depending on “experimental” measures to relying on “models,” which are more cost-effective than expensive sensor networks. However, he stressed that tracking aging effects in an operational environment over time is still difficult, and a mechanism that represents the physical system is needed.

Incorporating questions from workshop participants, Cruz-Neira moderated a discussion among the panelists. She posed a question about the difference between a digital twin and a simulation. Celaya replied that a digital twin is a living model that reflects the reality of the physical asset through time. Francom wondered if any value exists in making a concrete distinction between the two; instead, he said that the community should strive for inclusivity, focusing on how the modeling and simulation world could improve digital twins and vice versa. Kobryn agreed and added that the distinction only matters when one wants to spotlight special use cases for digital twins. She pointed out that the aspect of “updating” the digital twin with real-world data makes it unique. Cruz-Neira asked if a point exists when so many data are available on a physical asset that the physics-based model would be abandoned for an empirical model. Harris responded that, historically, that amount of data has never been available for large infrastructure. He explained that a realistic strategy to monitor structural health is to determine the current baseline with existing structures and use temporal measurements over time to understand boundary conditions and loading conditions. Celaya highlighted the value of leveraging physics with statistical inference rather than dynamic systems for these cases.

Cruz-Neira inquired about the critical elements for digital twins in engineering. Kobryn explained that digital twins help make the shift from steady-state, single-discipline engineering models and analyses to multidisciplinary efforts that blend the best models and simulations from computational and theoretical spaces with real-world data.

__________________

5 For more information about the treaty, see CTBTO Preparatory Commission, “The Comprehensive Nuclear-Test-Ban Treaty,” https://www.ctbto.org/our-mission/the-treaty, accessed April 7, 2023.


Harris emphasized the need to consider what these digital twins will look like when integrated into larger systems.

Cruz-Neira posed a question about how to integrate unit-level models with a system-level twin. Francom pointed out that the system level often reveals one aspect of information and the unit level another. When this happens, the models are mis-specified and model form error exists. Because model form error is difficult to track and quantify, he suggested allowing for disagreement in the models and embracing the uncertainty instead of forcing the models to agree, which creates statistical issues.

Cruz-Neira asked how to collaborate with decision-makers to define the requirements of digital twins to ensure that they are helpful. Kobryn asserted that if the end user is not involved with the digital twin design, resources are likely being spent to solve the wrong problem. At AFRL, design thinking and human-centered design are prioritized. Cruz-Neira highlighted the value of making any assumptions transparent to the end user. Francom noted that a spectrum of digital twins would be useful to explore different assumptions about and explanations for certain phenomena. Kobryn added that model-based systems engineering could make these assumptions explicit for an audience with expertise.

Technical Challenges and Opportunities

Celaya explained that challenges and opportunities for digital twins are business case and context dependent. For example, having a modeler in the loop is not scalable for many use cases, and opportunities exist to better leverage computational power and data access. As a result, the data could move from the physical asset to the digital twin in a computational step, and the digital twin would observe how the physical asset is behaving and adapt. He also suggested collecting data with the specific purpose of the digital twin in mind, and he noted that standards and guidelines could help better link the use case to the model requirements. Instead of leveraging old workflows to sustain digital twins, he proposed developing a new paradigm for optimization to unlock the value proposition of Industry 4.0.

Kobryn presented the following four categories of challenges and opportunities for digital twins: (1) working with end users to understand what information they want from a digital twin and how often that information should be updated; (2) determining how to protect personal privacy and intellectual property, how to secure data and models to prevent unauthorized access and maintain prediction validity, and how to address liability for operational failures; (3) determining the level of simulation fidelity and the level of tailoring for individual assets or operators, and strategies to verify and validate probabilistic simulations; and (4) deciding how to reduce computation time and cost and collect an asset’s state and usage data reliably and affordably.

Francom highlighted uncertainty quantification as both a challenge and an opportunity for any problem with high-consequence decisions. Extrapolation is particularly difficult when a digital twin in one regime is used to make predictions in a different regime, he noted, and balancing the physics and the empirical components of a digital twin is critical. Quantifying model form error is also a key challenge, especially when one cannot rely solely on empirical information, which is sometimes incorrect. To begin to address these issues, he mentioned that LANL is exploring opportunities to use information from neighboring systems and to cut the feedback loop when information could corrupt a digital twin.

Harris indicated that no good mechanism is available to deal with existing assets, even though most community stakeholders are concerned about maintaining the function and safety of existing infrastructure. Identifying a way to translate these old systems into more modern formats is needed, he said, although challenges will arise with scale. Most government agencies manage their own systems with limited resources, and working with end users to understand the problem the digital twins are being designed to address is critical. He explained that accessing data for critical infrastructure is especially difficult, given that the information is not publicly available for safety reasons, as is creating digital representations based on decades-old structural plans; however, opportunities exist to leverage new forms of data.


Incorporating questions from workshop participants, Tucker moderated a discussion among the panelists. He asked how a decision would be made to cut the feedback loop of a digital twin when the potential for corrupt information exists, and how one would verify that this was the right decision. Kobryn championed active assessment of data quality with automated technology. She cautioned that as a digital twin continues to be updated in real time, more computational challenges will arise, but implementation of zero trust networks could be beneficial. Celaya explained that domain knowledge could be used to validate information and check for corrupted sensors, corrupted data transportation, or noise. He emphasized the value of both technology and science to address this issue. Tucker wondered how a practitioner would determine whether the source of error is the sensor or the digital twin. Francom noted that the level of trust in the physics and the data is influenced by the fact that the data collection could be flawed. He added that industrial statistics and control charts could help detect whether the sensors are malfunctioning or the system itself is changing. Harris explained that because systems often work off of several sensors, opportunities exist to cross-reference with other sensors to determine whether the sensor or the model is causing the error. Kobryn suggested leveraging Bayesian statistical techniques, where new data are not overweighted. Celaya proposed that models be enhanced to reflect degradation phenomena of systems over time.
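
The panelists' suggestions (control charts on residuals and cross-referencing redundant sensors) can be sketched as follows: an exponentially weighted moving average (EWMA) chart flags a drift in each sensor's residuals, and a drift isolated to one of several co-located sensors points to a sensor fault, whereas a common drift points to a change in the system itself. The thresholds, sensor count, and injected drift below are illustrative assumptions.

```python
# Minimal sketch: EWMA control chart on sensor residuals, plus a cross-sensor
# check to separate sensor faults from genuine system change (illustrative).
import numpy as np

rng = np.random.default_rng(4)

def ewma_alarm(residuals, lam=0.2, k=4.0):
    """Return True if the EWMA of the residuals ever exceeds k EWMA-sigmas."""
    sigma = residuals[:50].std()                  # baseline noise from early, trusted data
    limit = k * sigma * np.sqrt(lam / (2.0 - lam))
    z = 0.0
    for r in residuals:
        z = lam * r + (1.0 - lam) * z
        if abs(z) > limit:
            return True
    return False

# Three co-located sensors measuring the same quantity; the model predicts 0.
n = 300
residuals = rng.normal(0.0, 1.0, size=(3, n))
residuals[1, 150:] += 2.5                         # synthetic drift injected into sensor 1 only

alarms = [ewma_alarm(residuals[i]) for i in range(3)]
if all(alarms):
    print("all sensors drift together -> suspect a real change in the system")
elif any(alarms):
    faulty = [i for i, a in enumerate(alarms) if a]
    print(f"isolated drift in sensor(s) {faulty} -> suspect sensor fault")
else:
    print("no drift detected")
```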

Tucker posed a question about the value of optimal experimental design, active learning, optimal sensor placement, and dynamic sensor scheduling. Kobryn described these as significant areas of opportunity for digital twins. For example, by using simulations to determine which test conditions to run and where to place sensors, physical test programs could be reduced and digital twins better calibrated for operation. Francom observed that there is much to be gained from these types of technologies, especially given the high cost of sensors. He highlighted the need to understand what is learned from a sensor; in some cases, sensor capability could be relaxed, but that kind of exercise requires thinking carefully about uncertainty.
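
Kobryn's point about using simulations to decide where to place sensors is often posed as an optimal design problem; one common heuristic is to add sensors greedily wherever the model's predictive uncertainty is largest. The sketch below applies that heuristic with a Gaussian process over candidate locations; the domain, kernel, and sensor budget are assumptions for illustration.

```python
# Minimal sketch: greedy sensor placement by repeatedly picking the candidate
# location with the largest predictive uncertainty (illustrative values only).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def field(x):
    """Stand-in for a simulated response field along a structure."""
    return np.sin(2.0 * x) + 0.3 * x

candidates = np.linspace(0.0, 5.0, 200).reshape(-1, 1)   # possible sensor locations
chosen_idx = [0]                                          # seed with one location
budget = 5

for _ in range(budget - 1):
    x_sel = candidates[chosen_idx]
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0),
                                  alpha=1e-4, optimizer=None)   # fixed kernel, no retuning
    gp.fit(x_sel, field(x_sel).ravel())
    _, std = gp.predict(candidates, return_std=True)
    std[chosen_idx] = 0.0                                 # do not reselect existing sites
    chosen_idx.append(int(np.argmax(std)))

print("selected sensor locations:", np.round(candidates[chosen_idx].ravel(), 2))
```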

Tucker invited the panelists to summarize the most significant opportunities in the digital twin ecosystem. Kobryn detailed an opportunity for product engineering to use digital twin techniques to develop a modeling certification basis for engineering parameters and to validate and design models for deployment. Francom recognized that researchers can learn much from data to improve modeling capabilities; a tighter integration of data and models could help pursue new scientific endeavors. Harris urged the community to gather demonstration cases and to cooperate with clients to understand the parameters in which they work and how a digital twin could solve their problem. Celaya highlighted opportunities for new types of optimization—in sustainability, decarbonization, and environmental protection—as well as for more frequent optimization with digital twins.

Research and Development and Investment

Celaya noted that significant modeling work remains for science to lead to engineering synthesis. He pointed out that methods to address trade-offs (e.g., model utility versus computational cost and speed versus uncertainty) are needed and expressed his support of software-defined modeling as well as contextualizing strong engineering principles in the current drive for empiricism. He encouraged increased investments to enable the scalability of new research as well as to exploit new computation paradigms (e.g., cloud, edge, and Internet of Things) with distributed models. Reflecting on which R&D areas should not receive investments could also be a valuable exercise, he added.

Kobryn described several “enabling technologies” for digital twins. She said that leveraging advances in computer engineering and cloud and mobile computing could help to develop architecture for computing, storing, and transporting; and advances in three-dimensional scanning and sensor technology could help advance understanding of the state of physical systems.


The Internet of Things and the democratization of data could also be leveraged, and many opportunities exist in physics-based modeling and simulation to improve sensemaking and interpretation of data. Additional opportunities exist to take advantage of big data and AI, but she cautioned against relying too much on either pure data-driven approaches or first principles physics and instead encouraged fusing the two approaches in a modern architecture.

Francom underscored that investments should be made in uncertainty quantification. He described the need both to consider the information that is entering a digital twin and to recognize the many possible ways to obtain the same answer (i.e., nonidentifiability), which is reflected as uncertainty. He also highlighted opportunities for people with different expertise to collaborate to address the challenges inherent in these nonlinear systems.

Harris encouraged investments to enable smart systems, which should be built and preserved with the future in mind to be sustainable, agile, resilient, and durable. These systems could be realized by leveraging automation, advanced sensing, data analytics, and community-centered engagement. He explained that strategic investment is also needed for successful use cases that demonstrate how end users benefit from digital twins, for corporate partnerships, and for interdisciplinary collaboration. Digital twins could progress with increased investment in the void between basic and applied research, he added.

Incorporating questions from workshop participants, Bingham moderated a discussion among the panelists. He wondered whether, as digital twins become more accurate, users will mistake the “map” they offer for reality. Francom reflected on the problems that could arise if a decision-maker fails to understand that the digital twin is not reality; this is an example of the value of transparency and communication about digital twins. Kobryn highlighted the need for general training in digital literacy, asserting that many decision-makers have nonscientific expertise. She cautioned against overloading the nontechnical workforce with technical information and proposed helping them understand how to use (or not use) data and develop the critical thinking to understand when information can (and cannot) be trusted.

Bingham inquired about top-priority investment areas. Harris championed large-scale investment in a series of interdisciplinary team efforts and in the human aspect of digital twins. In response to a question from Bingham about investments in high-performance computing and efficient algorithms to deal with the complexity and scale of digital twin applications, Kobryn proposed using the high-fidelity capability from high-performance computing to synthesize data to train reduced-order models to take advantage of available data and computing capability at the operating end. Multidisciplinary teams are needed to reduce the order of models to account for relevant physics while leveraging AI and ML, she said.
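
Kobryn's suggestion to use high-fidelity computing output to train reduced-order models is commonly implemented with proper orthogonal decomposition (POD): a low-dimensional basis is extracted from high-fidelity snapshots, essentially via a singular value decomposition, and subsequent work happens in the reduced space. The snapshot data in the sketch below are synthetic placeholders, not output from any workshop application.

```python
# Minimal sketch: build a proper orthogonal decomposition (POD) basis from
# high-fidelity snapshots and project a new state onto it (synthetic data).
import numpy as np

rng = np.random.default_rng(5)

# Synthetic "high-fidelity" snapshots: 2000 spatial points, 60 parameter runs.
x = np.linspace(0.0, 1.0, 2000)
params = rng.uniform(1.0, 3.0, size=60)
snapshots = np.stack([np.sin(np.pi * p * x) * np.exp(-p * x) for p in params], axis=1)

# POD basis from the singular value decomposition of the snapshot matrix.
U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
energy = np.cumsum(s**2) / np.sum(s**2)
r = int(np.searchsorted(energy, 0.999) + 1)       # modes capturing 99.9% of the energy
basis = U[:, :r]

# Project a new high-fidelity state into the reduced space and reconstruct it.
new_state = np.sin(np.pi * 2.2 * x) * np.exp(-2.2 * x)
coeffs = basis.T @ new_state                      # r reduced coordinates instead of 2000 values
reconstruction = basis @ coeffs
rel_error = np.linalg.norm(new_state - reconstruction) / np.linalg.norm(new_state)
print(f"kept {r} of {snapshots.shape[1]} modes, reconstruction error = {rel_error:.2e}")
```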

Bingham asked if investments in education would be beneficial. Instead of creating an entirely “digital curriculum,” Celaya suggested enhancing the current core engineering curriculum with a new focus on computing capability, disparate sources of data, and uncertainty to improve students’ data dexterity and analytical skills. Kobryn proposed that educators focus on systems engineering in context and provide more real-world experiences within multidisciplinary teams (e.g., Capstone projects) to better prepare students for the workforce. Francom suggested that educators build these real-world examples from weather and stock market data. Harris said that the Capstone experience is valuable, but more industry–academia collaborations would be beneficial. Celaya described an opportunity to teach students when to employ empirical science to inform decision-making. He reiterated the value of developing systems thinking, accompanied by open-mindedness, to model reality at different stages of abstraction.


DISCLAIMER This Proceedings of a Workshop—in Brief was prepared by Linda Casola as a factual summary of what occurred at the workshop. The statements made are those of the rapporteur or individual workshop participants and do not necessarily represent the views of all workshop participants; the planning committee; or the National Academies of Sciences, Engineering, and Medicine.

COMMITTEE ON FOUNDATIONAL RESEARCH GAPS AND FUTURE DIRECTIONS FOR DIGITAL TWINS Karen Willcox (Chair), Oden Institute for Computational Engineering and Sciences, The University of Texas at Austin; Derek Bingham, Simon Fraser University; Caroline Chung, MD Anderson Cancer Center; Julianne Chung, Emory University; Carolina Cruz-Neira, University of Central Florida; Conrad J. Grant, Johns Hopkins University Applied Physics Laboratory; James L. Kinter III, George Mason University; Ruby Leung, Pacific Northwest National Laboratory; Parviz Moin, Stanford University; Lucila Ohno-Machado, Yale University; Colin J. Parris, General Electric; Irene Qualters, Los Alamos National Laboratory; Ines Thiele, National University of Ireland, Galway; Conrad Tucker, Carnegie Mellon University; Rebecca Willett, University of Chicago; and Xinyue Ye, Texas A&M University–College Station. * Italic indicates workshop planning committee member.

REVIEWERS To ensure that it meets institutional standards for quality and objectivity, this Proceedings of a Workshop—in Brief was reviewed by Burcu Akinci, Carnegie Mellon University; Nia Johnson, National Academies of Sciences, Engineering, and Medicine; and Parviz Moin, Stanford University. Katiria Ortiz, National Academies of Sciences, Engineering, and Medicine, served as the review coordinator.

STAFF Beth Cady, Senior Program Officer, National Academy of Engineering, and Tho Nguyen, Senior Program Officer, Computer Science and Telecommunications Board (CSTB), Workshop Directors; Brittany Segundo, Program Officer, Board on Mathematical Sciences and Analytics (BMSA), Study Director; Kavita Berger, Director, Board on Life Sciences; Jon Eisenberg, Director, CSTB; Samantha Koretsky, Research Assistant, BMSA; Patricia Razafindrambinina, Associate Program Officer, Board on Atmospheric Sciences and Climate; Michelle Schwalbe, Director, National Materials and Manufacturing Board (NMMB) and BMSA; Erik B. Svedberg, Senior Program Officer, NMMB; and Nneka Udeagbala, Associate Program Officer, CSTB.

SPONSORS This project was supported by Contract FA9550-22-1-0535 with the Department of Defense (Air Force Office of Scientific Research and Defense Advanced Research Projects Agency), Award Number DE-SC0022878 with the Department of Energy, Award HHSN263201800029I with the National Institutes of Health, and Award AWD-001543 with the National Science Foundation.

This material is based on work supported by the U.S. Department of Energy, Office of Science, Office of Advanced Scientific Computing Research, and Office of Biological and Environmental Research.

This project has been funded in part with federal funds from the National Cancer Institute, National Institute of Biomedical Imaging and Bioengineering, National Library of Medicine, and Office of Data Science Strategy from the National Institutes of Health, Department of Health and Human Services.

Any opinions, findings, conclusions, or recommendations expressed do not necessarily reflect the views of the National Science Foundation.

This proceedings was prepared as an account of work sponsored by an agency of the United States Government. Neither the United States Government nor any agency thereof, nor any of their employees, makes any warranty, express or implied, or assumes any legal liability or responsibility for the accuracy, completeness, or usefulness of any information, apparatus, product, or process disclosed, or represents that its use would not infringe privately owned rights. Reference herein to any specific commercial product, process, or service by trade name, trademark, manufacturer, or otherwise does not necessarily constitute or imply its endorsement, recommendation, or favoring by the United States Government or any agency thereof. The views and opinions of authors expressed herein do not necessarily state or reflect those of the United States Government or any agency thereof.

SUGGESTED CITATION National Academies of Sciences, Engineering, and Medicine. 2023. Opportunities and Challenges for Digital Twins in Engineering: Proceedings of a Workshop—in Brief. Washington, DC: The National Academies Press. https://doi.org/10.17226/26927.
