Pages 72-82



From page 72...
... Developing digital twins that do not ignore salient rare events requires rethinking the loss functions and performance metrics used in data-driven contexts. A fundamental challenge in decision-making may arise from discrepancies between the data streamed from the physical system and the data predicted by the digital twin.
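As a minimal illustration of the loss-function point above, the sketch below upweights residuals on rare, extreme observations so that model fitting is not dominated by the bulk of the data. The quantile threshold, weight, and function name are illustrative assumptions, not a method from the report.

```python
import numpy as np

def tail_weighted_mse(y_true, y_pred, tail_quantile=0.95, tail_weight=10.0):
    """Mean squared error that upweights rare, extreme observations.

    Observations beyond the tail_quantile of |y_true| get a larger weight,
    so fitting is not dominated by the bulk of the data. The threshold and
    weight are illustrative choices, not values from the report.
    """
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    threshold = np.quantile(np.abs(y_true), tail_quantile)
    weights = np.where(np.abs(y_true) > threshold, tail_weight, 1.0)
    return float(np.mean(weights * (y_true - y_pred) ** 2))
```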
From page 73...
... A related set of research questions around optimal sensor placement, sensor steering, and dynamic sensor scheduling is discussed in Chapter 6.

DATA INTEGRATION FOR DIGITAL TWINS

Increased access to diverse and dynamic streams of data from sensors and instruments can inform decision-making and improve model reliability and robustness. The digital twin of a complex physical system often receives data in different formats from multiple sources with different levels of verification and validation (e.g., visual inspection, records of repairs and overhauls, and quantitative sensor data from a limited number of locations)
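One way to handle inputs arriving in different formats with different verification levels is to normalize them into a common record before assimilation. The schema and field names below are hypothetical, a sketch rather than anything prescribed in the report.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Observation:
    """Common record for heterogeneous digital twin inputs (fields hypothetical)."""
    source: str        # e.g., "visual_inspection" or "strain_gauge_07"
    timestamp: datetime
    value: float
    verification: str  # e.g., "qualitative" vs. "calibrated_sensor"

def normalize_inspection_note(note: dict) -> Observation:
    # Hypothetical mapping from a free-form inspection record to the schema.
    return Observation(
        source="visual_inspection",
        timestamp=datetime.fromisoformat(note["date"]),
        value=float(note["severity_score"]),
        verification="qualitative",
    )

obs = normalize_inspection_note({"date": "2024-05-01", "severity_score": 3})
```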
From page 74...
... For instance, ML methods used within digital twins need to be optimized to facilitate data assimilation with large-scale streaming data, and data assimilation methods that leverage ML models, architectures, and computational frameworks need to be developed. The scalability of data storage, movement, and management solutions becomes an issue as the amount of data collected from digital twin systems increases.

In some settings, the digital twin will face computational resource constraints (e.g., as a result of power constraints)
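As one hedged example of operating under such constraints, a streaming summary like Welford's running mean/variance lets a twin digest a sensor stream in constant memory. The report does not name this algorithm; it is offered only as an illustration.

```python
class StreamingStats:
    """Constant-memory running mean/variance (Welford's algorithm).

    Summarizes a sensor stream without storing it, one way a
    resource-constrained digital twin might cope with large-scale
    streaming data. (Illustrative; not prescribed by the report.)
    """
    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0  # running sum of squared deviations from the mean

    def update(self, x: float) -> None:
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    @property
    def variance(self) -> float:
        return self.m2 / (self.n - 1) if self.n > 1 else 0.0
```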
From page 75...
... Finally, note that the data quality challenges outlined above are present in the large-scale streaming data setting as well, making adaptive model training particularly difficult when anomalies and outliers may correspond to either sensor failures or salient rare events.

Data Fusion and Synchronization

Digital twins can integrate data from different data streams, which provides a means to address missing data or data sparsity, but there are specific concerns regarding data synchronization (e.g., across scales)
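A minimal sketch of one synchronization step follows, assuming two monotonically timestamped streams resampled onto a shared grid by linear interpolation; real digital twins may need uncertainty-aware or physics-informed alignment.

```python
import numpy as np

def synchronize(t_fast, x_fast, t_slow, x_slow, dt=1.0):
    """Resample two unevenly sampled streams onto a shared time grid.

    Linear interpolation is an illustrative choice; timestamps are
    assumed sorted and expressed in the same units.
    """
    t0 = max(t_fast[0], t_slow[0])   # overlap window only
    t1 = min(t_fast[-1], t_slow[-1])
    grid = np.arange(t0, t1 + dt, dt)
    return grid, np.interp(grid, t_fast, x_fast), np.interp(grid, t_slow, x_slow)
```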
From page 76...
... Fundamental challenges include aggregating uncertainty across different data modalities and scales as well as addressing missing data (a simple fusion rule is sketched after this excerpt). Strategies for data sharing and collaboration must address challenges such as data ownership and intellectual property issues while maintaining data security and privacy.

Challenges with Data Access and Collaboration

Digital twins are an inherently multidisciplinary and collaborative effort.
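As referenced above, one simple and idealized rule for aggregating uncertain estimates across modalities is inverse-variance weighting, which also degrades gracefully when a modality reports missing data. The independence and Gaussian-error assumptions are this sketch's, not the report's.

```python
import numpy as np

def fuse(estimates, variances):
    """Inverse-variance weighted fusion of several estimates of one quantity.

    Modalities reporting NaN (missing data) are excluded. Assumes
    independent Gaussian errors, an idealization of real modalities.
    """
    est = np.asarray(estimates, dtype=float)
    var = np.asarray(variances, dtype=float)
    ok = ~np.isnan(est)
    w = 1.0 / var[ok]                    # more certain sources weigh more
    fused = np.sum(w * est[ok]) / np.sum(w)
    fused_var = 1.0 / np.sum(w)          # fused uncertainty shrinks
    return fused, fused_var
```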
From page 77...
... There is a gap in the mathematical tools available for assessing data quality, determining appropriate utilization of all available information, understanding how data quality affects the performance of digital twin systems, and guiding the choice of an appropriate algorithm. Gaps marked 1 may benefit from initial investment before moving on to gaps marked with a priority of 2.
From page 78...
... It must be noted, however, that in some settings, specification of prior distributions can greatly impact the inferences that a digital twin is meant to provide, for better or for worse. Digital twins present specific challenges to Bayesian approaches, including the need for good priors that capture tails of distributions, the need to incorporate model errors and updates, and the need for robust and scalable methods under uncertainty and for high-consequence decisions.
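A small numerical illustration of why priors that capture tails matter: under a Gaussian prior a 6-sigma event is assigned vanishingly small density, while a heavy-tailed Student-t prior keeps it plausible, which can change the resulting inference. The distribution choices here are illustrative.

```python
import math

def gauss_logpdf(x, mu=0.0, sigma=1.0):
    # Standard Gaussian log density.
    z = (x - mu) / sigma
    return -0.5 * z * z - math.log(sigma * math.sqrt(2 * math.pi))

def student_t_logpdf(x, nu=3.0):
    # Standard Student-t log density with nu degrees of freedom.
    return (math.lgamma((nu + 1) / 2) - math.lgamma(nu / 2)
            - 0.5 * math.log(nu * math.pi)
            - (nu + 1) / 2 * math.log1p(x * x / nu))

# A 6-sigma event: roughly -18.9 under the Gaussian vs. -6.1 under the
# t-prior, i.e., many orders of magnitude more prior mass in the tail.
print(gauss_logpdf(6.0), student_t_logpdf(6.0))
```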
From page 79...
... Data-driven regularization approaches that incorporate more realistic priors are necessary for digital twins.

Optimization of Numerical Model Parameters Under Uncertainty

Another key challenge is to perform optimization of numerical model parameters (and any additional hyperparameters)
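One common way to pose such a problem, offered here as an assumption rather than the report's method, is sample average approximation: minimize the average misfit over samples of the uncertain inputs plus a prior-based regularizer. The toy linear model below is purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def model(theta, xi):
    # Toy forward model with uncertain input xi (illustrative).
    return theta * xi

def objective(theta, xi_samples, y_obs, reg=0.01, theta_prior=0.0):
    # Sample average approximation of the expected misfit,
    # plus a quadratic regularizer standing in for a prior.
    misfit = np.mean((model(theta, xi_samples) - y_obs) ** 2)
    return misfit + reg * (theta - theta_prior) ** 2

xi = rng.normal(1.0, 0.2, size=500)       # samples of the uncertain input
y = 2.0 * xi + rng.normal(0, 0.1, 500)    # synthetic noisy observations

thetas = np.linspace(0.0, 4.0, 401)       # coarse grid search for the sketch
best = thetas[np.argmin([objective(t, xi, y) for t in thetas])]
print(best)  # near the true value 2.0, shaded slightly toward the prior
```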
From page 80...
... These updates may be initiated when something in the physical counterpart evolves or in response to changes in the virtual representation, such as improved model parameters, a higher-fidelity model that incorporates new physical understanding, or improvements in scale/resolution. Due to the continual nature of digital twins as well as the presence of errors and noise in the models, the observations, and the initial conditions, sequential data assimilation approaches (e.g.,
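A minimal sketch of one forecast/analysis cycle of sequential assimilation follows, using a linear Kalman filter as the textbook example; the matrices and dimensions are placeholders for a real twin's model and sensors.

```python
import numpy as np

def kalman_step(x, P, F, Q, H, R, y):
    """One forecast/analysis cycle of a linear Kalman filter.

    x, P : state estimate and its covariance
    F, Q : model dynamics and model-error covariance
    H, R : observation operator and observation-error covariance
    y    : new observation vector
    """
    # Forecast: propagate the state; inflate uncertainty with model error.
    x_f = F @ x
    P_f = F @ P @ F.T + Q
    # Analysis: weigh forecast against observation via the Kalman gain.
    S = H @ P_f @ H.T + R
    K = P_f @ H.T @ np.linalg.inv(S)
    x_a = x_f + K @ (y - H @ x_f)
    P_a = (np.eye(len(x)) - K @ H) @ P_f
    return x_a, P_a
```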
From page 81...
... Traceability of model hierarchies and reproducibility of results are not fully considered in existing data assimilation approaches.

Digital Twin Demands for Actionable Time Scales

Most literature focuses on offline data assimilation, but the assimilation of real-time sensor data for digital twins to be used on actionable time scales will require advancements in data assimilation methods and tight coupling with the control or decision-support task at hand (see Chapter 6). For example, the vast, global observing system of the Earth's atmosphere and numerical models of its dynamics and processes are combined in a data assimilation framework to create initial conditions for weather forecasts.
From page 82...
... Increasing computational capacity alone will not address these issues.

Large Parameter Spaces and Data-Rich Scenarios

For many digital twins, the sheer number of numerical model parameters that need to be estimated and updated can present computational issues of tractability and identifiability.
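One hedged diagnostic for identifiability in large parameter spaces is to examine the singular values of a finite-difference sensitivity (Jacobian) matrix: near-zero singular values flag parameter directions the data cannot constrain. The toy model below is illustrative.

```python
import numpy as np

def sensitivity_matrix(model, theta, eps=1e-6):
    """Finite-difference Jacobian of model outputs w.r.t. parameters."""
    theta = np.asarray(theta, dtype=float)
    y0 = model(theta)
    J = np.empty((y0.size, theta.size))
    for j in range(theta.size):
        t = theta.copy()
        t[j] += eps
        J[:, j] = (model(t) - y0) / eps
    return J

# Toy model where only the sum theta[0] + theta[1] affects the output,
# so the difference direction is unidentifiable from these data.
model = lambda th: np.array([th[0] + th[1], 2.0 * (th[0] + th[1])])
J = sensitivity_matrix(model, np.array([1.0, 1.0]))
print(np.linalg.svd(J, compute_uv=False))  # one near-zero singular value
```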

