SEPTEMBER 2025 | Volume 46, Issue 3
IN THIS JOURNAL:
- Issue at a Glance
- Chairman’s Message
Technical Articles
- Kernel Model Validation: How To Do It, And Why You Should Care
- Confidence-Based Skip-Lot Sampling
- Eucalyptus – An Analysis Suite for Fault Trees with Uncertainty Quantification
- Digital Twins in Reliability Engineering: Innovations, Challenges and Opportunities
- Competence Measure Enhanced Ensemble Learning Voting Schemes
- Advancing the Test Science of LLM-enabled Systems: A Survey of Factors and Conditions that Matter Most
- Beyond Accuracy: Evaluating Bayesian Neural Networks in a Real-world Application
- Balancing Structure and Flexibility: Evaluating Agile, Waterfall, and Hybrid Methodologies in Aerospace and Defense Projects
Workforce of the Future
- Building Confidence, Interest, and Opportunity: A Social Cognitive Career Theory-Based Analysis of the Young Women in Engineering Outreach Program
News
- Association News
- Chapter News
- Corporate Member News
Digital Twins in Reliability Engineering:
Innovations, Challenges and Opportunities

David Han
Romo Endowed Professor, University of Texas at San Antonio,
San Antonio, TX

James D. Brownlow
Statistician, Air Force Test Center, 412th Test Wing,
Edwards Air Force Base, CA
Abstract
The digital twin (DT) is a rapidly evolving technology that draws upon a multidisciplinary foundation, integrating principles from computer science, physics, mathematics, statistics, and engineering. Its applications are diverse, spanning fields such as engineering, healthcare, biomedicine, climate science, renewable energy, and national security. This work discusses the characterization, development, and application of DT and identifies both the challenges and opportunities that lie ahead. The study identifies research gaps and a path forward to advance the statistical and computational foundations of DT and its applications in reliability engineering and preventive maintenance for statistical quality control and assurance. By fostering innovation in total quality management, DT has the potential to enhance operational practices across industries. Leveraging advanced data analytics, data science, machine learning (ML), and artificial intelligence (AI), DT can support the monitoring, simulation, and optimization of complex systems, ensuring higher quality, greater reliability, and improved decision-making. Continued investment in DT technologies would contribute to improved system understanding, particularly in engineering design, evaluation, and reliability analysis.
Keywords: Digital twin technology, Predictive analytics, Preventive maintenance, Reliability engineering, Statistical quality management
Introduction
The rapid advancement of information technologies such as the Internet of Things (IoT), cloud computing, big data analytics, machine learning (ML), and artificial intelligence (AI) has enabled the development of virtual representations of physical systems and entire environments, significantly impacting innovation across industries by merging the physical and virtual realms. The concept of the digital twin (DT), initially introduced by Dr. Michael Grieves in 2002 and further refined in subsequent studies (Grieves 2014, Grieves 2017), offers a robust framework for this integration. As (VanDerHorn 2021) stated, DT is integral to digital transformation efforts aimed at developing business models that capitalize on data value, effectively bridging the physical and digital worlds. According to (Grieves 2017), DT comprises three main elements: the real world, the digital world (i.e., a virtual representation of the real world), and bidirectional data connections; see Figure 1 for an illustration. Physical-to-virtual (P2V) connections transfer data from real-world sensors to the virtual environment, allowing for accurate virtual modeling through technologies such as spectroscopic and 3D vision systems. Virtual-to-physical (V2P) connections carry information and processes from the virtual model back to the real world, enabling physical adjustments based on virtual insights; see (Verna 2023). This bidirectional interaction allows physical experimentation data to enhance the virtual model. By contrast, a Digital Shadow is a virtual model that only mirrors the physical system without such interactive feedback; see (Kaewunruen 2018). Hence, DT differs from traditional simulation and modeling, where analysis is typically performed offline and no continuous connection between the physical and virtual worlds is maintained; see (Jones 2020).
For illustration, consider a DT system in the context of process control and quality management. As depicted in Figure 1, a DT is composed of a physical/real system and its virtual/digital replica. Using sensors and IoT devices, process and quality data are continuously collected from the physical system. This data encompasses product measurements, production defects, performance tracking, machine parameters, environmental conditions (e.g., temperature and pressure), etc. The stream of data is transmitted to the virtual system via P2V connections. By integrating this data, the virtual model is updated and refined to better reflect the current physical system, enhancing the predictive capabilities of the virtual system and contributing to more robust and adaptive quality management. The virtual system then simulates the behavior of the physical system while monitoring critical quality parameters such as defect rates and production yields. The output data is analyzed using various data science tools and techniques, including descriptive/predictive/prescriptive analytics, AI/ML, and statistical quality control methods such as control charts, Pareto charts, and regression analyses, in order to identify patterns, trends, and potential quality issues. In this step, variations and defects in the manufacturing/production process can be identified, enabling anomaly detection and root cause analysis, while predictive analytics can forecast future quality performance based on trends. Based on these results, the system generates actionable insights, which include predictions of future quality deviations (e.g., high defect rates, abnormal process behavior) and recommendations for corrective actions to address identified issues (e.g., recalibrating machines, changing process parameters). These insights are sent back to the physical system through V2P connections, leading to real-time updates and adjustments of process and control parameters to mitigate defects and optimize quality. Process and quality data are then collected from the physical system again and transmitted to the virtual system. This continuous, real-time monitoring and feedback loop ensures that both physical and virtual systems remain optimized, reducing defects and improving overall product quality and reliability as well as productivity and performance.
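To make this loop concrete, below is a minimal Python sketch of the monitor-and-correct cycle just described, using a Shewhart X-bar check as the statistical quality control step. The `read_sensors` interface, the simulated tool-wear drift, and all numerical values are illustrative assumptions standing in for a real P2V/V2P integration, not a specific production system.

```python
import numpy as np

# Hypothetical process interface; in practice this wraps IoT/SCADA connectors.
def read_sensors(rng, shift=0.0):
    """P2V: pull one batch of part measurements from the physical line."""
    return rng.normal(loc=10.0 + shift, scale=0.2, size=25)  # diameter (mm)

class VirtualTwin:
    """Digital replica tracking the process mean against its specification."""
    def __init__(self, target, sigma):
        self.target, self.sigma = target, sigma

    def monitor(self, batch):
        """Shewhart X-bar check: flag batch means beyond 3-sigma control limits."""
        xbar = batch.mean()
        limit = 3 * self.sigma / np.sqrt(len(batch))
        return xbar, abs(xbar - self.target) > limit

rng = np.random.default_rng(1)
twin = VirtualTwin(target=10.0, sigma=0.2)
drift = 0.0
for cycle in range(10):                    # continuous P2V -> analyze -> V2P loop
    drift += 0.05                          # simulate gradual tool wear
    xbar, alarm = twin.monitor(read_sensors(rng, drift))
    if alarm:                              # prescriptive insight -> corrective action
        print(f"Cycle {cycle}: V2P action, recenter setpoint by {10.0 - xbar:+.3f} mm")
        drift = 0.0                        # physical process recalibrated
```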
In the context of developmental and operational test and evaluation, DT can function as virtual test beds, enabling the refinement of test plans, simulation of operational conditions, and targeted testing (i.e., test scoping) where data is sparse or uncertainty is high. DT may also assist in identifying ‘gaps and margins’ – critical areas where traditional testing is insufficient or too costly. For example, reliability engineers traditionally use reliability block diagrams (RBD) for system-level analysis. DT can extend this capability by dynamically updating failure rates and component interdependencies using live data, offering a more responsive and representative model for mission planning and test and evaluation activities. Since its emergence as an engineering breakthrough offering a sophisticated computational framework for representing physical systems, the DT technology has evolved significantly due to advancements in high-performance computing, code development, and experimental validation. DT consists of dynamic models that continuously integrate and adapt data to meet the evolving demands and challenges of the systems they simulate; see (Jones 2020). DT is increasingly recognized as a transformative tool that redefines engineering processes, enhancing capabilities across various sectors. These advanced digital models integrate comprehensive life cycle data with real-time performance information, enabling a continuous loop of optimization. This integration not only reduces risk and accelerates the transition from design to production but also improves decision-making by linking real-time data with virtual representations; see (Liu 2021). Moreover, DT facilitates remote monitoring, predictive analytics, stakeholder collaboration, and diverse training opportunities. Recognizing this critical role of DT in advancing Industry 4.0, the European Network for Business and Industrial Statistics (ENBIS) organized a conference in Copenhagen, Denmark last year, solely dedicated to the advancements and applications of DT. Additionally, the National Academies of Sciences, Engineering, and Medicine (NASEM) held a series of workshops in 2023 and 2024 to address the foundational research gaps and explore future directions for DT.
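As a hedged illustration of the RBD idea above, the following sketch recomputes mission reliability for a hypothetical series-parallel system after a component's failure rate is refreshed from live field data. The exponential failure model, the simple maximum-likelihood rate update, and all numbers are assumptions for illustration; a fielded DT would typically use richer lifetime models and Bayesian updating.

```python
import numpy as np

def reliability_exp(rate, t):
    """Component reliability at mission time t under an exponential failure model."""
    return np.exp(-rate * t)

def update_rate(failures, exposure_hours):
    """Refresh a component's failure-rate estimate from live field data
    (a simple MLE here; a Bayesian update could be substituted)."""
    return failures / exposure_hours

# Hypothetical two-branch RBD: component A in series with parallel pair B1, B2.
rates = {"A": 1e-4, "B1": 5e-4, "B2": 5e-4}    # failures per hour (assumed)

# Live data arrives: B1 has logged 3 failures over 4,000 operating hours.
rates["B1"] = update_rate(failures=3, exposure_hours=4000)

t = 100.0                                       # mission time (hours)
r_A = reliability_exp(rates["A"], t)
r_B = 1 - (1 - reliability_exp(rates["B1"], t)) * (1 - reliability_exp(rates["B2"], t))
print(f"Mission reliability: {r_A * r_B:.4f}")  # series of A and the parallel B-pair
```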
The design and functionality of a DT system are tailored to address specific questions, which become increasingly complex as they integrate detailed physical phenomena, geometric intricacies, and sources of uncertainty. One major challenge is acquiring validation data, which is often costly and difficult to obtain, compounded by the variability of real-world environments that can cause significant deviations in structural properties. While computational models of physical systems are typically deterministic, the probabilistic nature of input data necessitates incorporating uncertainty by varying model parameters according to known or assumed probability distributions; see (NASEM 2024). DT systems generally combine physics-based models (e.g., finite element analysis) with data-driven models that utilize AI/ML and statistical methods, or adopt hybrid approaches that blend both methodologies. Despite reliance on nominal geometry and material properties, obtaining detailed data such as residual stresses, initial flaws, thermal distributions, and geometric variations remains challenging; see (NASEM 2024). Additionally, DT systems must account for various time and length scales to accurately model system behavior under different conditions; see (VanDerHorn 2021). Recognizing and understanding the limitations of DT systems is crucial for their effective application. These limitations arise from assumptions related to the physical system, training data, validation data, and input conditions. Confidence in DT predictions varies across different domains, highlighting the need to quantify uncertainty and fidelity for specific applications. Modeling operational and environmental changes presents further challenges, though in situ monitoring data can provide valuable insights. For phenomena lacking first-principles models but supported by ample data, data-driven methods can complement or even replace traditional physics-based models, enhancing decision-making and system optimization. Addressing variability in DT models, especially for systems with uncertain inputs, remains a significant area for further research. Robust methods for obtaining validation data and quantifying initial conditions are necessary to manage these uncertainties. Integrating data-driven approaches with advanced computational models facilitates real-time simulation, predictive analytics, and decision-making, particularly for complex systems. Advancements in human-computer interfaces, interdisciplinary education, and improvements in verification and validation (V&V) processes are essential for advancing the field; see (Modoni 2022).
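The standard way to push probabilistic inputs through a deterministic model, as described above, is Monte Carlo propagation. The sketch below runs an assumed closed-form deflection model (standing in for an expensive FEA code or surrogate) over inputs drawn from assumed distributions; the model form, distributions, and parameters are illustrative only.

```python
import numpy as np

def deflection(load, stiffness):
    """Deterministic physics model (placeholder for, e.g., an FEA surrogate)."""
    return load / stiffness

rng = np.random.default_rng(42)
n = 100_000
# Inputs drawn from assumed distributions rather than fixed nominal values.
load = rng.normal(1000.0, 50.0, n)                   # N; measurement scatter
stiffness = rng.lognormal(np.log(2.0e5), 0.05, n)    # N/m; material variability

samples = deflection(load, stiffness)                # propagate through the model
lo, hi = np.percentile(samples, [2.5, 97.5])
print(f"Deflection: mean {samples.mean()*1e3:.2f} mm, "
      f"95% interval [{lo*1e3:.2f}, {hi*1e3:.2f}] mm")
```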

Figure 1: Composition of a digital twin and its feedback loop for quality/reliability, productivity and performance optimization
With the aim of examining the state of the art, challenges, and opportunities of DT technology in enhancing the practices of quality management and reliability engineering, the rest of the paper is organized as follows. First, we explore how DT technology is utilized in various industries by integrating real-world data with virtual models, enhancing operational efficiency and reducing costs. Then, we investigate both technical and non-technical challenges faced by DT technology, along with significant opportunities for enhancing system performance, optimizing resource usage, and improving decision-making through advanced data integration and collaborative efforts. The research and development (R&D) needed for DT is discussed in the following section in order to address those challenges and leverage the opportunities. The concluding remarks summarize the main findings and outline directions for future research, including further advances in model fidelity and the integration of data science tools, ML, and AI with DT technology for enhanced decision-making capabilities.
Current Practices & Benefits of Digital Twins
The evolution of DT technology has significantly impacted a range of industries, including automotive, manufacturing, aviation, and infrastructure. Defined as a dynamic virtual replica of a physical asset, process, or system, DT technology integrates real-world data to predict outcomes and enhance performance. It supports new business models and decision-making systems by leveraging faster optimization algorithms, powerful computing, and vast amounts of data. This technology serves as both a tool for improving product design and engineering and a medium for solving complex problems through enhanced communication and visualization. Below is a list of successful applications of the DT systems in various industrial sectors; see (Guo 2022, Kabashkin 2024, Sadeghi 2024).
- Manufacturing Industry: DT comprises three main components – the physical system, its virtual counterpart, and the data-driven connection between them. This linkage allows for real-time updates to the digital model, optimizing performance forecasts, operational costs, and lifecycle expenses. For example, DT enhances additive manufacturing (AM) by providing real-time insights and predictive analytics that optimize design, production, and quality assurance processes. In other manufacturing domains, DT has supported predictive maintenance and real-time monitoring, with limited case studies reporting reduced downtime and improved product quality, thereby enhancing overall operational efficiency. Cybersecurity remains a critical concern, though, particularly for systems relying on DT for crucial decision-making, necessitating robust protections against cyber-physical threats.
- Structural Engineering: DT provides significant benefits for monitoring, simulation, and optimization of engineered structures, making it invaluable for enhancing structural reliability, statistical quality control, and preventive maintenance. By leveraging ML, AI, and high-performance computing, DT can simulate intricate structural behaviors, optimize maintenance schedules, and predict system failures, leading to safer and more efficient engineering solutions.
- Automotive Sector: DT technology is pivotal in advancing Industry 4.0, facilitating real-time simulations and customer-oriented manufacturing processes. In particular, it enables the integration of real-time data through IoT devices embedded throughout the vehicle and production process. For example, IoT sensors in modern vehicles monitor parameters such as engine temperature, braking patterns, and fuel efficiency, transmitting this data to DT for performance analysis and predictive maintenance. On the manufacturing side, IoT-enabled assembly lines collect data on component quality, process timing, and equipment status, feeding DT models that simulate and optimize production workflows. DT are categorized into six levels of sophistication, from basic informational models (Level 1) to advanced systems that guide decision-making and knowledge creation (Level 6). These models are instrumental in rapidly detecting anomalies and attributing their causes, despite challenges related to accuracy and error quantification.
- Aviation Sector: DT is employed for fleet management, capturing manufacturing and operational variations. It is used to refine aircraft operations and maintenance strategies, achieving greater efficiency and lowering operational costs. These models help in understanding component behavior in the field and reducing disruptions like unscheduled maintenance. Securing data flow within DT ecosystems is essential, with solutions ranging from centralized data hubs to decentralized edge computing.
- Aerospace Engineering: DT creates detailed digital models of aircraft engines, which support performance simulation and predictive maintenance, leading to increased engine reliability and reduced maintenance costs. Early adopters have reported cost savings by using DT to identify software defects earlier in the development lifecycle, reducing the number of physical prototypes and streamlining regression testing across software builds. For example, DT have been used to simulate control system updates in aerospace applications, allowing engineers to validate changes virtually before deployment, leading to measurable reductions in rework and test cycle times.
- System Maintenance: DT is used to maintain structural integrity while balancing the need to reduce downtime with avoiding catastrophic failures. This involves analyzing fleet usage patterns and historical data to inform simulations that provide actionable maintenance insights. Future developments will focus on enhancing simulation accuracy by connecting various physical phenomena and integrating probabilistic analysis.
- Large-scale Infrastructure: DT offers insights into system performance and future scenarios. It is used to monitor bridge structures and track localized damage over time. While traditional sensing methods have been costly and limited, DT provides a cost-effective alternative through modeling, although challenges remain in accurately representing aging effects and operational conditions.
- Stockpile Stewardship: DT plays a critical role in ensuring the safety and functionality of nuclear stockpiles. This involves leveraging global and local data to refine predictive simulations and manage uncertainty. Tools such as design of experiments (DOE), surrogate modeling, and uncertainty quantification (UQ) are used in conjunction with DT to improve understanding and accuracy. Addressing gaps in current methodologies for uncertainty reasoning remains a key area for R&D.
- Computational Science: In fields like fluid dynamics, DT improves modeling accuracy by integrating real-time data with simulations. This is particularly useful in the aerospace and energy sectors, where precise simulations are critical. Ongoing research in this area aims to develop offline dynamic system models that operate closer to real-time. Although these models face accuracy issues, they can enhance productivity by using simple algorithms for decision-making.
- Healthcare & Medicine: DT systems are used to create virtual models of patients’ organs, physiological systems, or entire bodies. These patient-specific twins allow clinicians to simulate medical procedures, assess treatment options, and predict patient outcomes before applying them in real life. For instance, DT have been employed to simulate cardiac interventions, evaluate surgical strategies, and optimize implant placements. By integrating data from imaging technologies, electronic health records, and wearable devices, DT support personalized treatment planning and can enhance diagnostic accuracy, treatment safety, and clinical decision-making. These systems also offer potential for remote monitoring and continuous health status tracking, contributing to proactive patient care and early intervention.
- Food & Agriculture: DT in agriculture, often referred to as smart farming applications, combine IoT, remote sensing, and simulation technologies to build virtual models of crops, soil conditions, and environmental factors. These models enable real-time monitoring of crop health, soil moisture, and nutrient levels, supporting more precise and sustainable farming practices. For example, DT can predict crop yields, guide irrigation and fertilization schedules, and help detect pest infestations or disease outbreaks early. Farmers use data from satellite imagery, drones, and ground-based sensors to feed and refine the DT, improving operational efficiency and sustainability, reducing waste, and adapting farming strategies to changing environmental conditions.
Another promising application of DT is in training and augmented reality (AR). For example, DT can be used to virtually train engineers for engine overhauls and maintenance. Additionally, modeling a manufacturing floor digitally before installing physical equipment offers opportunities to optimize efficiency and layout. The application of DT also extends to smart city planning, homeland security, transportation, and space operations; see (Kaewunruen 2018). As illustrated, the state of DT technology varies across industries, with simpler decision-support systems (Levels 1 to 3) already in use, and the role of DT continues to evolve across sectors. Despite the transformative potential, DT faces several challenges such as integrating real-time data into models and ensuring computational efficiency. The main challenges lie in addressing rare events and uncertain scenarios, particularly for more complex problems; see (Kenett 2022). To maximize the potential of DT, continuous learning from real-world observations and improved validation with real-world data are essential. Developing frameworks for real-time model updates will help maintain accuracy and relevance.
It is important to understand that DT is distinguished from a plain simulation system as it is capable of continuously updating with real-world data, offering a dynamic model that reflects physical reality. Current practices reveal that simply using DT as computational models is insufficient to unlock new value. Main issues include quantifying the relationship between data volume and prediction confidence, particularly for low-probability, high-risk events. Understanding and managing uncertainty, coupled with rapid decision-making, are vital for DT success. Building frameworks that support continuous model updates based on field data will address these challenges. The interoperability of component models and integration of diverse data sources also present significant challenges. Inconsistent data formats and validation criteria hinder seamless DT functionality. Addressing these coordination issues is critical for enabling real-time, context-driven evaluations that enhance overall system reliability and performance. In energy systems and grid modernization, for instance, advanced modeling capabilities and real-time data interaction are crucial for predictive and forensic analysis and resource protection; see (NASEM 2024). The importance of integrating unit-level and system-level models and involving end-users in the design process cannot be overstated. To maximize the effectiveness, DT must incorporate advanced attributes such as a focus on reliability and optimization across different asset types. Collaboration with technology providers is crucial to enhance data management, cloud integration, and modeling capabilities, thereby improving productivity and efficiency. Collaboration with decision-makers and transparency in assumptions are also crucial for developing effective DT systems that address real-world problems and meet user needs.
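The continuous-updating property that separates a DT from a plain simulation can be illustrated with a simple recursive estimator. The sketch below uses a scalar Kalman-style filter, one common choice among many, to keep a model parameter synchronized with streaming field measurements; the parameter, noise levels, and data stream are all assumed for illustration.

```python
import numpy as np

class TwinParameter:
    """Scalar 1-D Kalman filter: keeps a model parameter (e.g., a degradation
    rate) synchronized with streaming field data instead of freezing it at
    design-time values, as a plain offline simulation would."""
    def __init__(self, estimate, variance, process_var, meas_var):
        self.x, self.p = estimate, variance
        self.q, self.r = process_var, meas_var

    def update(self, measurement):
        self.p += self.q                          # predict: uncertainty grows
        k = self.p / (self.p + self.r)            # Kalman gain
        self.x += k * (measurement - self.x)      # correct toward field data
        self.p *= (1 - k)
        return self.x

rng = np.random.default_rng(0)
true_rate = 0.8                                    # unknown truth in the field
twin = TwinParameter(estimate=0.5, variance=1.0, process_var=1e-4, meas_var=0.05)
for _ in range(50):                                # noisy sensor stream
    twin.update(true_rate + rng.normal(0, 0.2))
print(f"Calibrated rate: {twin.x:.3f} (truth {true_rate})")
```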
Challenges & Opportunities of Digital Twins
DT technology presents a comprehensive view of system interconnections, highlighting both technical and non-technical challenges and opportunities in defining these relationships. Achieving a holistic representation of a real system remains challenging at this point. The implementation of DT typically involves sensors, data models, and human-computer interfaces. Effective integration of AI with DT could greatly enhance their ability to detect, interpret, and respond to system behaviors.
1. Technical Challenges & Opportunities
Transitioning from theoretical models to actionable insights necessitates addressing major enablers, including uncertainty propagation, rapid inference, model error quantification, identifiability, causality, optimization, control, reduced-order models, and multifidelity information. Refining these technical aspects is crucial for ensuring DT are efficient, accurate, and capable of real-time decision-making. Currently, a major technical challenge of DT is the scalability of integrating modelers into DT processes; thus, leveraging computational power and enhancing data accessibility are crucial. DT should be designed to adapt based on real-time observations of physical assets, focusing on purposeful data collection and developing new optimization paradigms to unlock Industry 4.0’s potential; see (Sony 2020, Shivam 2022). Other challenges include understanding user requirements for information and update frequency, securing data and models to protect privacy and intellectual property, determining simulation fidelity and validation strategies for probabilistic simulations, and reducing computation time and cost while ensuring reliable and affordable data collection.
Uncertainty quantification (UQ) is both a challenge and an opportunity, particularly when extrapolating predictions across different regimes. In testing and evaluation, UQ is particularly critical when data is sparse, as in early development phases. Methods such as Bayesian inference, Monte Carlo simulations, Gaussian process modeling, and bootstrapping can be used to characterize uncertainty in DT predictions. To effectively adopt UQ for these needs, hybrid modeling approaches can be employed as well, combining data-driven methods and physics-based simulations. This allows for a more comprehensive understanding of uncertainties by leveraging historical data while accounting for fundamental physical laws. In scenarios with limited data, Bayesian inference provides a structured framework to incorporate prior knowledge and update predictions as new data becomes available, improving extrapolation accuracy. For instance, Bayesian hierarchical models can incorporate expert knowledge and field test data to improve prediction intervals. Additionally, developing scalable algorithms for probabilistic inference can facilitate real-time UQ in large-scale systems, enhancing the DT’s ability to adapt to varying conditions. Test data plays a crucial role in this process by providing the ground-truth observations needed to calibrate and validate these algorithms. In particular, developmental and operational test datasets help define prior distributions, tune model parameters, and assess prediction accuracy under real-world conditions. This iterative feedback from test data improves the reliability of inference and supports the continuous refinement of DT models for robust decision-making.
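A minimal sketch of the Bayesian updating described above, assuming a conjugate Gamma-Poisson model for a failure rate: an expert-informed prior is combined with sparse test evidence to yield a posterior credible interval. The prior parameters and test counts here are hypothetical.

```python
import numpy as np
from scipy import stats

# Prior on failure rate lambda (per hour), encoding expert judgment from
# earlier developmental testing: Gamma(a0, rate b0), with mean a0/b0.
a0, b0 = 2.0, 20_000.0            # assumed prior: mean 1e-4 failures/hour

# Sparse new test evidence: 1 failure observed over 5,000 test hours.
failures, hours = 1, 5_000.0

# Conjugate Gamma-Poisson update of the posterior parameters.
a_post, b_post = a0 + failures, b0 + hours

post = stats.gamma(a_post, scale=1.0 / b_post)
lo, hi = post.ppf([0.05, 0.95])
print(f"Posterior mean rate: {post.mean():.2e} /hour")
print(f"90% credible interval: [{lo:.2e}, {hi:.2e}] /hour")
```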
UQ can also be coupled with ML to continuously refine model predictions, enabling DT systems to handle complex, high-dimensional data and perform reliably across different operational regimes. This approach not only addresses the challenge of UQ but also unlocks new opportunities for improving decision-making in uncertain environments. Therefore, balancing physics-based and empirical components of DT and quantifying model form errors are critical. Exploring opportunities with neighboring systems and managing feedback loops are part of ongoing efforts to address these issues. It should be noted that ML techniques typically require large volumes of high-quality data, which may not always be available in early-stage testing or low-observability systems. In such cases, alternative approaches like Bayesian inference or physics-informed surrogate models may be more effective for updating predictions with limited data. While ML can enhance DT performance under data-rich conditions, its use should be evaluated against data availability, model complexity, and the specific reliability requirements of the application.
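One common way to balance the physics-based and empirical components mentioned above is to let a data-driven model learn only the residual between field observations and the physics prediction, so that the learned part directly represents model-form error. Below is a sketch of that pattern using a Gaussian process from scikit-learn; the physics placeholder, data, and kernel choice are illustrative assumptions.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def physics_model(x):
    """First-principles prediction (placeholder low-fidelity model)."""
    return 2.0 * x

# Field observations deviate from the physics due to unmodeled effects.
rng = np.random.default_rng(3)
X = rng.uniform(0, 5, 40).reshape(-1, 1)
y = 2.0 * X.ravel() + 0.5 * np.sin(2 * X.ravel()) + rng.normal(0, 0.05, 40)

# Learn only the residual (model-form error), not the whole response.
resid = y - physics_model(X.ravel())
gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gp.fit(X, resid)

x_new = np.array([[2.5]])
corr, corr_std = gp.predict(x_new, return_std=True)
pred = physics_model(x_new.ravel()) + corr
print(f"Hybrid prediction: {pred[0]:.3f} +/- {corr_std[0]:.3f} (model-form error band)")
```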
For existing infrastructure, translating legacy systems into modern digital formats presents significant challenges, especially given the difficulty in accessing critical data and creating digital representations from outdated plans. Despite these challenges, leveraging new data forms offers potential solutions. There is also a need for active data quality assessment and robust verification methods to cut feedback loops when data corruption risks arise. Using domain knowledge, automated technology, and Bayesian statistical techniques can help manage and validate data integrity. Thus, the major opportunities within the DT ecosystem include developing modeling certification for engineering parameters, tighter integration of data and models, and exploring new optimization avenues in sustainability and environmental protection; see (De Ketelaere 2022).
In the manufacturing sector, DT offers substantial opportunities for optimizing resource use and predicting potential issues, leading to enhanced quality control and more efficient supply chain management. However, a major technical challenge in this context is ensuring interoperability between systems and securing data within the DT environment, especially given the risks of cyber threats in highly integrated, data-driven manufacturing processes; see (Alkan 2018). In complex systems such as those utilized by leading aerospace companies, multiple DTs are required to represent different layers of business operations, encompassing market, product, component, and production. Integrating data across these layers, often spanning multiple organizations, presents challenges including intellectual property protection, regulatory compliance, and certification. These challenges are not only technical but also commercial, legal, and programmatic. The transition from invention to production remains a critical hurdle in DT adoption. This phase requires balancing data availability, scalability, and the expertise of a multidisciplinary team, including data scientists, software engineers, and business leaders. The benefits of DT in this phase include improved asset reliability, planned maintenance, and reduced inspection burdens, all of which stem from combining automated, real-time data collection with ML, AI, and domain expertise; see (Zhu 2022).
2. Non-technical Challenges & Opportunities
There are several non-technical barriers to DT implementation as well. Regulatory agencies, for instance, may be slow to recognize the value of DT due to entrenched practices in traditional physical testing. Demonstrating the efficacy of virtual testing, particularly in areas such as automotive crash testing, could help shift this perspective. While physical validation will likely remain necessary, virtual testing grounded in accurate physics models offers superior data. Changing the culture of the testing community is a significant barrier to broader DT adoption. Interoperability challenges also extend to mathematical models, particularly in maintaining the DT over a product’s life cycle. The information required at various stages of a product’s life differs, and transferring data across these stages remains poorly understood. Ensuring that the DT continues to provide meaningful and actionable information as the product ages is a major technical challenge. Models must continually adapt and learn, using ground-truth verification to maintain accuracy. Cultural, financial, and managerial barriers further complicate DT adoption. Various functional areas, such as design, engineering, and manufacturing, often operate in silos with distinct cultures and decision-making processes. These divisions can create challenges in accountability and problem-solving when integrating DT. Leadership must promote collaboration and data-driven insights to develop affordable, collective solutions. A significant challenge is bridging the knowledge gap for decision-makers, many of whom are not familiar with digital technologies. For these individuals, DT may seem opaque or even magical, making it difficult to fully trust and invest in the technology. Building trust and understanding through education, training, and incremental investment is essential for ensuring that decision-makers recognize the value of DT and support their development and application.
As discussed, the technical and non-technical challenges surrounding DT are multi-faceted. While the potential to improve system performance, reduce operational costs, and enhance decision-making is evident, substantial hurdles remain in data integration, regulatory acceptance, and organizational culture shifts. Success in overcoming these barriers will depend on continuous advancements in data science, AI/ML, and robust collaboration across industries and disciplines.
Research & Development in Digital Twins
While the previous section outlined the current challenges and potential opportunities surrounding DT implementation, this section focuses on the R&D efforts needed to address those challenges and advance the technology. It highlights emerging technical directions, computational methods, and infrastructure required to realize the full potential of DT systems across industries. The evolving landscape of the DT technology is characterized by a hierarchical model, reflecting increasing levels of sophistication and capability. As shown in Table 1, these levels typically range from basic descriptive twins that provide real-time monitoring, to fully autonomous systems capable of decision-making and self-optimization. Each level builds on the previous by incorporating more advanced analytics, greater integration of data sources, and increased autonomy; see (VanDerHorn 2021). At the foundational level, the virtual twin replicates the physical attributes of an asset or facility. The connected twin advances this by integrating real-time data to evaluate performance at specific instances, often requiring human oversight. The predictive twin uses data for forecasting outcomes, while the prescriptive twin employs advanced modeling and simulation to recommend future actions. At the highest level, the autonomous twin can independently learn, make decisions, and apply predictive and prescriptive analytics to resolve issues in real-time. The progression of DT technology hinges not only on technological advancements but also on effective collaboration among individuals and organizations. Enhanced computing power and insights gained from past experiences are crucial for predicting and addressing future challenges associated with DT; see (Galetto 2020). Developing physics-constrained, probabilistic models is vital for the successful implementation of DT; see (Verna 2022). Efficient and precise algorithms for scalable inference and UQ are essential, alongside efforts to establish open standards, common terminology, and improved methods for model V&V; see (Galetto 2021).
Table 1: Digital twin maturity model from descriptive to autonomous systems
| Level | DT Type | Description |
| --- | --- | --- |
| 1 | Descriptive Twin | It provides real-time monitoring using historical or static data. It answers “What is happening?” |
| 2 | Connected Twin | It integrates real-time sensor data, reflecting current asset status. It answers “What is the current state?” |
| 3 | Predictive Twin | It forecasts future behavior based on models and data. It answers “What will happen?” |
| 4 | Prescriptive Twin | It suggests actions to optimize outcomes. It answers “What should we do?” |
| 5 | Autonomous Twin | It takes action autonomously based on AI/ML plus feedback loops. It answers “Can it act on its own?” |
Significant progress is still needed to bridge the gap between scientific advancements and engineering applications in the DT technology. Addressing trade-offs, such as model utility versus computational cost and speed versus uncertainty, is essential. Investment in scalable research and new computational paradigms, including cloud computing and edge technologies, is crucial for advancing DT capabilities. Evaluating areas that may not warrant further investment could also be beneficial. Advancements in computer engineering, cloud computing, and 3D scanning are essential to developing effective DT architectures; see (Genta 2018). Leveraging these technologies, along with the IoT and big data analytics, can enhance the understanding of physical systems, improve data interpretation, and facilitate predictions; see (Franceschini 2018). For instance, the U.S. Air Force Test Center (AFTC) has been utilizing a deep learning LSTM (Long Short-Term Memory) model, a type of recurrent neural network (RNN) architecture, as a DT surrogate to predict forces during aircraft maneuvers, helping to ensure flight safety. Integrating data-driven approaches with first-principles physics within a modern framework is recommended to avoid over-reliance on either method.
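As a rough illustration of the kind of LSTM surrogate mentioned above, the following PyTorch sketch maps a window of maneuver telemetry to a predicted structural load. The architecture, feature set, and dimensions are illustrative assumptions and do not represent the AFTC model.

```python
import torch
import torch.nn as nn

class ForceSurrogate(nn.Module):
    """Sequence-to-one LSTM: maps a window of maneuver telemetry
    (e.g., airspeed, angle of attack, control-surface deflections)
    to a predicted structural load."""
    def __init__(self, n_features=4, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, num_layers=2, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                  # x: (batch, time, features)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])    # load estimate at window end

model = ForceSurrogate()
telemetry = torch.randn(8, 100, 4)         # 8 windows x 100 timesteps x 4 channels
loss = nn.MSELoss()(model(telemetry), torch.randn(8, 1))
loss.backward()                            # standard supervised training step
```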
In addition, UQ is highlighted as a vital area for R&D with a focus on understanding data variability and addressing non-identifiability in nonlinear systems. Incorporating aleatoric and epistemic uncertainties into DT systems can significantly enhance their effectiveness, particularly in ensuring reliability and optimization across various asset types. Aleatoric uncertainty, which arises from inherent variability or randomness in a system, can be addressed through probabilistic models such as Monte Carlo simulations and Bayesian inference to account for this variability in predictions. Epistemic uncertainty, which stems from incomplete knowledge or data, can be managed by continually updating models as new information is gathered, often through ML techniques like active learning. Integrating these uncertainties into a unified framework for UQ allows DT to generate confidence intervals and/or conformal intervals for predictions, leading to more informed decision-making. Furthermore, combining UQ with reliability analysis tools ensures accurate modeling of potential failures, optimizing maintenance schedules and asset management strategies. This approach enables DT to provide more robust predictions, enhancing reliability and performance across diverse applications. Thus, collaborative efforts among experts from various fields are necessary to tackle these complexities.
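For the conformal intervals mentioned above, a split conformal procedure is one simple, model-agnostic option: it calibrates an interval width from held-out residuals with a finite-sample coverage guarantee, regardless of the underlying surrogate. The sketch below assumes a hypothetical set of calibration residuals from any DT prediction model.

```python
import numpy as np

def split_conformal_interval(residuals_cal, y_hat_new, alpha=0.1):
    """Split conformal prediction: calibrate a finite-sample-valid interval
    from held-out absolute residuals, independent of the model class."""
    n = len(residuals_cal)
    # Conformal quantile with the finite-sample ceiling correction.
    q = np.quantile(np.abs(residuals_cal),
                    np.ceil((n + 1) * (1 - alpha)) / n)
    return y_hat_new - q, y_hat_new + q

# Hypothetical held-out calibration residuals from a DT surrogate model.
rng = np.random.default_rng(7)
residuals = rng.normal(0, 0.3, 200)

lo, hi = split_conformal_interval(residuals, y_hat_new=5.0, alpha=0.1)
print(f"90% conformal interval: [{lo:.2f}, {hi:.2f}]")
```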
Data management is also central to the success of DT. To maximize the potential, scalable platforms capable of handling both structured and unstructured data are required, along with computing power commensurate with the data scale. The DT ecosystems are expected to utilize vast datasets and advanced analytics for improved decision-making. Effective data management ensures not only the availability of relevant data but also its accuracy, consistency, and timeliness – factors that directly impact the reliability of the DT’s outputs. Poor data quality or latency can lead to model drift, incorrect predictions, or delayed responses in critical systems (i.e., GIGO – garbage in, garbage out). Therefore, establishing data governance frameworks, including metadata standards, lifecycle versioning, and access control policies, is essential to maintain the integrity of DT systems over time. Interoperability, involving the seamless transfer of information across systems, remains a critical challenge; see (Verna 2023). Platforms that facilitate communication between different machines and harmonize data formats are crucial for overcoming these challenges. Cybersecurity also poses a significant challenge and it is an active area of R&D. Beyond intrusion detection, there is a need for robust recovery plans for data and models in the event of a cyberattack. Secure data capture and management practices, including encryption and key management, are essential due to the global movement of physical assets.
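A minimal sketch of the data-quality gating implied by the GIGO concern above: each incoming sensor record is checked for schema, physical plausibility, and staleness before it is allowed to update the virtual model. The record fields, units, and thresholds are assumed for illustration.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

@dataclass
class SensorRecord:
    sensor_id: str
    value: float
    unit: str
    timestamp: datetime

def passes_quality_gate(rec: SensorRecord, max_age=timedelta(seconds=5)) -> bool:
    """Reject records that would corrupt the twin: wrong units, physically
    implausible values, or stale timestamps (a latency/GIGO guard)."""
    fresh = datetime.now(timezone.utc) - rec.timestamp <= max_age
    plausible = rec.unit == "degC" and -50.0 <= rec.value <= 400.0
    return fresh and plausible

rec = SensorRecord("furnace-7", 182.4, "degC", datetime.now(timezone.utc))
if passes_quality_gate(rec):
    pass  # safe to stream into the virtual model via the P2V connection
```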
1. Digital Twins R&D for Quality Management
Maintaining competitiveness in the manufacturing sector requires companies to quickly respond to customer demands while ensuring high product quality; see (Galetto 2021). Quality inspections and evaluations are integral to manufacturing systems, essential for preventing the delivery of defective products to users. These inspections, which can involve human inspectors, automated sensing equipment, or a combination of both, generate valuable structured and unstructured data that can be continuously fed into DT models for ongoing evaluation of semi-finished products, final goods, or production-related components. Monitoring programs, whether embedded in-line or post-process, are key enablers for building accurate, up-to-date digital representations of manufacturing conditions. To ensure consistency over the product lifecycle, it is essential that data from inspections adhere to standardized formats and schemas, allowing seamless integration into DT architectures.
As (Montgomery 2010) stated, effective quality control management is crucial for continuously monitoring production systems and enhancing market competitiveness. Cost-effective inspection procedures are vital for reducing quality-related expenses and maintaining market competitiveness; see (Franceschini 2018). Defects can significantly impact product quality and pricing, making predictive models essential for monitoring production processes, forecasting quality fluctuations, and providing early warnings. This leads to reduced production waste, optimized product yields, and minimized losses; see (He 2022). As demand for customization increases, companies must produce smaller batch sizes and develop accurate defect prediction models to implement effective quality controls. The increasing complexity and dynamic nature of manufacturing resources pose challenges for quality control and management.
Quality 4.0, leveraging Industry 4.0 technologies, addresses these challenges by enhancing quality control and maintenance practices; see (Shivam 2022). It brings numerous benefits, including increased enterprise efficiency, performance, and innovation, and improved business models; see (Sony 2020, Verna 2023). Recent trends show a growing embrace of full digitalization across industries, driven by the need for improved productivity, quality, and performance. According to (Guo 2022), DT is pivotal in this transition to Industry 4.0, but despite its transformative potential, existing literature reveals that DT implementations often overlook quality control measures in manufacturing processes; see (Liu 2021, De Ketelaere 2022, Modoni 2022). Models developed in the literature for identifying product defects often highlight the correlation between assembly complexity and defect rates; see (Alkan 2018, Galetto 2020, Verna 2022). Previous studies underscore the importance of assembly sequencing for improving processes; see (Bisgaard 1997). Integrating these models into a DT framework enables continuous monitoring of production processes and enhances final product quality; see (Zhu 2022). DT can simulate various production scenarios, identify defect-inducing conditions, and provide feedback to the physical system through V2P connections; see Figure 1. Retrospective analysis also aids in detecting quality issues and improving system performance.
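To illustrate the complexity-to-defect-rate relationship these models exploit, here is a sketch of a Poisson regression mapping an assembly complexity score to expected defects per unit, in the spirit of (Galetto 2020, Verna 2022) but not reproducing their models; the data are synthetic and the complexity index is a hypothetical feature.

```python
import numpy as np
from sklearn.linear_model import PoissonRegressor

# Hypothetical historical data: assembly complexity score vs. defects per unit.
rng = np.random.default_rng(11)
complexity = rng.uniform(1, 10, (200, 1))                 # e.g., step-count index
dpu = rng.poisson(np.exp(0.25 * complexity.ravel() - 1))  # synthetic defect counts

model = PoissonRegressor(alpha=1e-4).fit(complexity, dpu)

# Twin-side use: score a candidate assembly sequence before release.
candidate = np.array([[7.5]])
print(f"Predicted defects per unit: {model.predict(candidate)[0]:.2f}")
```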
DT offers real-time insights into process and equipment conditions through data acquisition and mapping, making them a valuable alternative to traditional quality control methods. As illustrated in Figure 1, DT gathers quality-related data via P2V connections. This data is integrated with various experimental models and defect prediction models; see (Guo 2022). The processed data is then transmitted back to the real world via V2P connections to prevent critical issues and forecast overall process quality. The integration of real-time data from sensors and simulations allows for continuous improvement of both experimental models and real-world components. Advancements in sensing technology and computing power enable DT to transition from offline monitoring to real-time solutions, proving to be powerful tools for quality control across diverse applications. Using the DT approach for quality control in manufacturing processes provides significant advantages in managing the assembly and disassembly of complex product types. It offers efficient simulations and reduces the trial-and-error process, thereby enhancing product quality. The development of DT involves defining the relationship between assembly quality and process control parameters through appropriate defect generation models; see (Ma 2019). Integrating defect prediction models with DT can complement traditional quality control methods, optimizing complex product assembly by combining physical and digital techniques and transitioning from experience-based to data-driven approaches.
For example, (De Ketelaere 2022) suggested integrating quality control with DT using data from multiple sensors to create simulation models for inspecting fruit quality during processing. This approach emphasizes the need for field testing to verify defect generation models’ accuracy, allowing only high-productivity, low-defect scenarios to be tested in the real world, thereby reducing extensive real-world testing. (Kenett 2022) implemented DT for rotating machinery based on a dynamic model for gear fault diagnosis, showcasing the DT’s capabilities in monitoring, diagnosing, prognosticating, and providing analytical prescriptions. In a market characterized by increasingly complex and customized products, an integrated production management and control approach is crucial. The DT systems offer real-time simulation, data-driven analysis, and dynamic feedback, which are essential for optimizing the manufacturing processes of complex products.
Conclusions
The integration of DT technology across industries is rapidly transforming the way systems are monitored, controlled, and optimized. By bridging the physical and digital worlds, DT enables real-time simulations, predictive analytics, and prescriptive insights, enhancing operational efficiency and quality management. The evolving landscape of DT technology holds immense potential across various sectors. In the realm of quality management, DT’s ability to forecast defects, simulate production scenarios, and optimize processes in real time offers significant advantages. As businesses shift toward Industry 4.0, DT plays a pivotal role in realizing the vision of Quality 4.0 by integrating advanced sensing technologies, data analytics, and ML and AI models. In doing so, it moves beyond traditional methods, providing data-driven solutions that improve product quality, reduce waste, and enhance overall performance.
However, significant challenges remain, such as achieving the necessary scale and fidelity and addressing data-related issues including ownership, management, interoperability, intellectual property rights, cybersecurity, and the development of open standards. Continuous data integration and enhanced coordination between DT components will be critical to unlocking the full potential of this transformative technology in reliability engineering and quality management. Thus, the future trajectory of DT technology hinges on a strategic blend of advanced research, investment in robust data infrastructure, and enhanced workforce education. Major challenges such as ensuring interoperability among systems, maintaining stringent cybersecurity measures, and achieving model V&V will be critical to address. By emphasizing real-world applications and demonstrating immediate business advantages, the DT industry is poised for continued evolution and innovation, driving progress across diverse sectors, including quality management and reliability engineering.
References
Alkan, B., Vera, D.A., Ahmad, M., Ahmad, B., and Harrison, R. (2018). “Complexity in manufacturing systems and its measures: a literature review,” European Journal of Industrial Engineering, 12: 116–150.
Bisgaard, S. (1997). “Designing experiments for tolerancing assembled products,” Technometrics, 39: 142–152.
De Ketelaere, B., De Smeets, B., Verboven, P., Nicolai, B., and Saeys, W. (2022). “Digital twins in quality engineering,” Quality Engineering, 34: 404–408.
Franceschini, F., Galetto, M., Genta, G., and Maisano, D.A. (2018). “Selection of quality-inspection procedures for short-run productions,” International Journal of Advanced Manufacturing Technology, 99: 2537–2547.
Galetto, M., Verna, E., and Genta, G. (2020). “Accurate estimation of prediction models for operator-induced defects in assembly manufacturing processes,” Quality Engineering, 32: 595–613.
Galetto, M., Verna, E., and Genta, G. (2021). “Effect of process parameters on parts quality and process efficiency of fused deposition modeling,” Computers and Industrial Engineering, 156: e107238.
Genta, G., Galetto, M., and Franceschini, F. (2018). “Product complexity and design of inspection strategies for assembly manufacturing processes,” International Journal of Production Research, 56: 4056–4066.
Guo, J. and Lv, Z. (2022). “Application of digital twins in multiple fields,” Multimedia Tools and Applications, 81: 26941–26967.
He, Z., Hu, H., Zhang, M., and Yin, X. (2022). “Product quality prediction based on process data: taking the gearbox of company D as an example,” Quality Engineering, 34: 409–422.
Jones, D., Snider, C., Nassehi, A., Yon, J., and Hicks, B. (2020). “Characterizing the digital twin: a systematic literature review,” CIRP Journal of Manufacturing Science and Technology, 29: 36–52.
Kabashkin, I. (2024). “Digital twin framework for aircraft lifecycle management based on data-driven models,” Mathematics, 12: e2979.
Kaewunruen, S., Rungskunroch, P., and Welsh, J. (2018). “A digital-twin evaluation of net zero energy building for existing buildings,” Sustainability, 11: e159.
Kenett, R.S. and Bortman, J. (2022). “The digital twin in Industry 4.0: a wide-angle perspective,” Quality and Reliability Engineering International, 38: 1357–1366.
Liu, M., Fang, S., Dong, H., and Xu, C. (2021). “Review of digital twin about concepts, technologies, and industrial applications,” Journal of Manufacturing Systems, 58: 346–361.
Ma, Y., Zhou, H., He, H., Jiao, G., and Wei, S. (2019). “A digital twin-based approach for quality control and optimization of complex product assembly,” Proceedings of IEEE International Conference on Artificial Intelligence and Advanced Manufacturing, 762–767.
Modoni, G.E., Stampone, B., and Trotta, G. (2022). “Application of the digital twin for in process monitoring of the micro injection molding process quality,” Computers in Industry, 135: e103568.
Montgomery, D.C., Runger, G.C., and Hubele, N.F. (2010). Engineering Statistics. New York, NY: Wiley & Sons.
National Academies of Sciences, Engineering, and Medicine (2024). Foundational Research Gaps and Future Directions for Digital Twins. Washington, DC: The National Academies Press.
Sadeghi, A., Bellavista, P., Song, W., and Yazdani-Asrami, M. (2024). “Digital twins for condition and fleet monitoring of aircraft: towards more-intelligent electrified aviation systems,” IEEE Access (in press).
Shivam, D. and Gupta, M. (2022). “Quality process re-engineering in Industry 4.0: a BPR perspective,” Quality Engineering, 35: 110–129.
Sony, M., Antony, J., and Douglas, J.A. (2020). “Essential ingredients for the implementation of quality 4.0: a narrative review of literature and future directions for research,” TQM Journal, 32: 779–793.
VanDerHorn, E. and Mahadevan, S. (2021). “Digital twin: generalization, characterization and implementation,” Decision Support Systems, 145: e113524.
Verna, E., Genta, G., Galetto, M., and Franceschini, F. (2022). “Defects-per-unit control chart for assembled products based on defect prediction models,” International Journal of Advanced Manufacturing Technology, 119: 2835–2846.
Verna, E., Puttero, S., Genta, G., and Galetto, M. (2023). “Toward a concept of digital twin for monitoring assembly and disassembly processes,” Quality Engineering, 36: 453–470.
Zhu, X. and Ji, Y. (2022). “A digital twin-based multi-objective optimization method for technical schemes in process industry,” International Journal of Computer Integrated Manufacturing, 36: 443–468.
Author Biographies
David Han, M.S., Ph.D., is a Romo Endowed Professor at the University of Texas at San Antonio, where he teaches statistics and data science. His research interests include statistical modeling and inference, machine learning, and artificial intelligence applied to lifetime analysis and reliability engineering.
James D. Brownlow, Ph.D. in Statistics, is a technical expert with the USAF Statistics Flight at Edwards AFB, California. His research interests include Bayesian analysis and the application of artificial intelligence and stochastic differential equations to flight test.
Dewey Classification: L 681 12


