Build, Control and Run Digital Twins with ParaView

In this blog post, we demonstrate the central role of scientific visualization in Digital Twins, and how Kitware’s solutions, such as ParaView and LidarView, bring powerful real-time processing capabilities for large data, including running AI-based surrogate models for complex predictions.

Visualization in ParaView of an electric drive with live temperature sensors and interpolation across the 3D model

Scientific Visualization at the core of Digital Twins

The term “Digital Twin” covers many different definitions, from the modeling of a physical phenomenon to an enhanced virtual supervision system for a complex industrial setup. As depicted in Figure 1, in this blog post we consider the specific flavor where we gather data from an actual system, usually through sensors; feed a numerical model with the real-time data; visualize the result; make predictions with derived quantities; and eventually influence the actual system through a feedback loop.

Figure 1 – Digital Twin of an actual system with feedback loop.

This process is not new: it is very similar to the supervision process used in industrial installations. The main difference is the recent technological breakthroughs that make it possible to manage very large data as input, throughput and output of the models, leveraging novel artificial-intelligence-based techniques like deep surrogates for fast predictions at scale. Processing large data also means using High Performance Computing capabilities to tackle datasets that a classic workstation would not be able to open and process.

ParaView for Digital Twins

Visualization is a powerful tool for Digital Twins. For example, while the Digital Twin is running, being able to extract meaningful information from sensor data and display it is a natural way to monitor the actual system. But scientific visualization can also be used while building the Digital Twin, to monitor the correctness of the building process.

When data is massive, ParaView is the de facto tool for data analysis and scientific visualization. Thanks to its client/server architecture, ParaView can interact with large data living on a computer cluster. Only the final rendering is streamed to the user, usually as a flow of images. The user can then interact remotely with the data and trigger new analyses, which are computed on the cluster where the massive data is located. Figure 2 illustrates the wide ecosystem of ParaView, including bridges to web frontends, Virtual Reality devices, Python AI libraries, and rendering backends like NVIDIA Omniverse. This rich environment makes ParaView a key resource for building scalable Digital Twins.

Figure 2. ParaView Ecosystem

Some real use cases are climate predictions made by DKRZ, where ParaView is used to process petabytes of data, or weather prediction scenarios from The Cyprus Institute. ParaView for Digital Twins is also widely used in academic studies, for example in the CALM-AA project, where aeroacoustic measurements are visualized through Virtual Reality (VR) headsets with live interaction with the data.

We also mention the VESTEC project, funded by the European Union, for urgent decision making using ensemble simulations and in situ analysis with ParaView Catalyst. This project demonstrates that early feedback from ensemble simulations with ParaView has a meaningful impact on urgent decisions and can save time, resources and human lives. These use cases demonstrate that ParaView, through Catalyst or live sources, can gather information from a real system, visualize it in 3D and send information back, actually closing the loop.

Now, let’s dive into the challenges that you may face while building your Digital Twin.

First Challenge: Gather your Data in Real Time

Interoperability of data is always a key factor in any scientific workflow, and Digital Twins face this challenge as well. For live sensor processing, ParaView provides a live source module which forces the application to refresh a given input at a fixed frequency. The output can be any VTK data type, from simple table values to complex meshes, so you can adapt it to your sensor data.
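The refresh mechanism can be pictured as a simple polling loop: at a fixed frequency, the source re-reads the sensors and republishes a record. The sketch below is plain Python, with a `read_sensors` stub standing in for real hardware; in ParaView itself, the Live Source infrastructure drives the refresh and the output would be a VTK data object pushed down the pipeline.

```python
import time

def read_sensors():
    """Stub standing in for real sensor hardware."""
    return {"t_ms": time.monotonic_ns() // 1_000_000, "temperature": 87.3}

def poll(read, period_s, n_updates):
    """Re-read the input at a fixed frequency, like a ParaView live source.

    Each refresh yields one record; in ParaView the equivalent output would
    be a VTK table (or any other VTK data type) sent down the pipeline.
    """
    rows = []
    for _ in range(n_updates):
        rows.append(read())
        time.sleep(period_s)
    return rows

updates = poll(read_sensors, period_s=0.01, n_updates=3)
```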

ParaView has a large ecosystem and you can interact with your data in several ways: with a web application based on trame, with a Jupyter notebook, or through virtual reality headsets.

Below is a demonstration of real-time acoustic data visualization, using the OpenXR plugin of ParaView:

Use Case: Live LiDAR Data Visualization in LidarView

An actual application using the power of ParaView live sources is LidarView, which brings LiDAR sensor streams and data replays into the ParaView ecosystem, enabling advanced real-time workflows like Simultaneous Localization and Mapping (SLAM) or object detection.

3D map of a neighbourhood created with a Velodyne Puck sensor, processed with SLAM within LidarView

But simpler cases are also possible, like connecting IoT devices to ParaView. Below is a demonstration of flexion sensors connected to ParaView, driving the deformation of an airplane wing:

Use Case: Electric Drive Digital Twin from TotalEnergies

An illustration of such a project is one we developed with the team of the TotalEnergies Fuels and Lubricants research center located in Solaize, France.

In this context, an electric drive is equipped with about 30 temperature sensors. The raw signals are processed on the test bench with AVL PUMA 2 and exported over UDP in a custom-made packet format.

Note: Other data transfer protocols could be implemented, as long as an open driver allows capturing and decoding the transmitted packets.

Similarly to what is done in LidarView with 3D sensor data, the packets are then received by a Live Source in ParaView, which uses the open source library libtins to abstract the network packet sniffing interface.
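To make the decoding step concrete, here is a minimal sketch in Python. The packet layout below is entirely hypothetical (the actual TotalEnergies format is custom and not public): a timestamp, a sensor count, then one (id, temperature) pair per sensor. In the real pipeline, payloads would come from libtins-captured frames rather than being built in memory.

```python
import struct

# Hypothetical packet layout (the actual format is a custom one):
# uint32 timestamp (ms), uint16 sensor count, then per sensor:
# uint16 sensor id, float32 temperature (all big-endian).
HEADER = struct.Struct(">IH")
SENSOR = struct.Struct(">Hf")

def decode_packet(payload: bytes):
    """Decode one UDP payload into (timestamp_ms, {sensor_id: temperature})."""
    timestamp_ms, count = HEADER.unpack_from(payload, 0)
    readings = {}
    offset = HEADER.size
    for _ in range(count):
        sensor_id, temperature = SENSOR.unpack_from(payload, offset)
        readings[sensor_id] = temperature
        offset += SENSOR.size
    return timestamp_ms, readings

# Build a sample packet and decode it.
packet = HEADER.pack(1000, 2) + SENSOR.pack(1, 85.5) + SENSOR.pack(2, 90.25)
ts, temps = decode_packet(packet)
```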

Knowing the sensors’ positions on the CAD model, their values can then be displayed live; recordings can be created using the pcap network capture format and replayed with the built-in player of ParaView.

We have also added interpolation between the sensors, relying on expert domain knowledge to choose the proper interpolation paradigm for the different blocks of the STEP files. Each temperature sensor is dynamically monitored to detect when it stops working correctly, in which case neighbouring sensors are used to perform the interpolation on the model.
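The expert interpolation scheme itself is domain-specific, but the faulty-sensor fallback can be illustrated with a simple inverse-distance weighting, chosen here purely for illustration: sensors flagged as faulty are skipped, so their neighbours automatically take over.

```python
import math

def idw(point, sensors, power=2.0):
    """Inverse-distance-weighted estimate at `point` from working sensors.

    `sensors` maps (x, y, z) positions to temperature values; entries whose
    value is None (a sensor flagged as faulty) are skipped, so neighbouring
    sensors naturally take over the interpolation.
    """
    num = den = 0.0
    for pos, value in sensors.items():
        if value is None:  # faulty sensor: ignore it
            continue
        d = math.dist(point, pos)
        if d == 0.0:
            return value  # exactly on a working sensor
        w = 1.0 / d ** power
        num += w * value
        den += w
    return num / den

sensors = {
    (0.0, 0.0, 0.0): 80.0,
    (1.0, 0.0, 0.0): 90.0,
    (0.5, 0.0, 0.0): None,  # this sensor stopped reporting
}
mid = idw((0.5, 0.0, 0.0), sensors)  # falls back to the two neighbours
```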

Below is an example of visualization of the temperature sensors, from the outside as well as the inside of the engine, as it heats up.

This gives engineers from TotalEnergies direct access to a visualization of the measured quantities while the engine is running, and lets them perform analysis within ParaView, either using the whole set of existing filters or even implementing their own.

Additional steps in this use case may include:

  • Adding physics-informed models or deep surrogate models to better interpolate the heat in real time
  • Adjusting these models to fit the sequence of data and visualizing their output against real-world data

More generally, any system that provides live data at known positions within a 3D model could benefit from this work: temperature probes, air pollution sensors, oxygen levels in a building, and so on.

Second Challenge: Process your Data in Real Time

Once your data is streamed live into ParaView, the second challenge is to extract meaningful insight from it and eventually run predictive models. ParaView is feature-rich, and you can:

  • Display values at 3D positions with a dedicated color map, in real time
  • Compute derived quantities with the Python Calculator
  • Interpolate values on a mesh where sensors are missing
  • Accumulate values over time to compute criteria such as failure or fatigue
  • Record data to disk
  • Run predictive models based on previous simulations

To run predictions in real time, it is very common to apply domain-specific models such as expert lookup tables or specific rules. The Python API of ParaView lets you run any routine, including loading external libraries.

A common way to apply predictions in real time is to use surrogate models, either based on mathematical decompositions (SVD, DMD…) or on neural networks (deep surrogates). Thankfully, both are supported in ParaView for real-time predictions!
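As a minimal sketch of the SVD-based family, the snippet below builds a low-rank basis from a matrix of simulation snapshots (synthetic data here), then reconstructs a new state from a handful of reduced coefficients. In a real surrogate, the reduced coefficients would typically be predicted from sensor inputs rather than computed from the full state.

```python
import numpy as np

rng = np.random.default_rng(0)

# Snapshot matrix: each column is one full simulated field
# (here 200 points x 50 simulations, synthetic rank-3 data).
modes_true = rng.normal(size=(200, 3))
coeffs = rng.normal(size=(3, 50))
snapshots = modes_true @ coeffs

# Offline: a truncated SVD gives a low-rank basis of the simulation data.
U, s, Vt = np.linalg.svd(snapshots, full_matrices=False)
basis = U[:, :3]  # keep the 3 dominant modes

# Online: project a new state onto the basis and reconstruct it.
new_state = modes_true @ rng.normal(size=3)
reduced = basis.T @ new_state          # 3 numbers instead of 200
reconstructed = basis @ reduced
```

Because the synthetic snapshots have rank 3, the reconstruction is exact; on real data the truncation rank trades accuracy against speed.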

AI in Digital Twins

Artificial Intelligence (AI) techniques are at the core of modern Digital Twins because, even as data sizes grow, the prediction, visualization and feedback loop should stay as close as possible to real time. Deep learning surrogate models are one technique to ensure the expected performance, and ParaView can run inference of a deep learning model and use the output to display meaningful data [Ribes et al. (2022)]. As described in Figure 3, we have demonstrated that the in situ capabilities of ParaView Catalyst can also be leveraged during the training of a deep learning network, by monitoring the output at the end of each epoch and tuning the training process for better prediction and performance.
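The monitoring pattern itself is simple: an epoch-end hook receives the model’s current predictions so that an in situ pipeline (ParaView Catalyst in the real setup) can render or score them. The sketch below uses a toy gradient descent on y = w·x purely for illustration; the callback is a stand-in for the actual Catalyst invocation.

```python
def train(xs, ys, epochs, on_epoch_end):
    """Toy training loop with an epoch-end monitoring hook."""
    w, lr = 0.0, 0.01
    history = []
    for epoch in range(epochs):
        for x, y in zip(xs, ys):
            w -= lr * 2 * (w * x - y) * x  # gradient step on squared error
        preds = [w * x for x in xs]
        on_epoch_end(epoch, w, preds)  # hand the current state to the monitor
        history.append(w)
    return history

monitored = []
def monitor(epoch, w, preds):
    # In the real workflow this callback would push `preds` to Catalyst
    # for in situ rendering; here we just record that it was called.
    monitored.append(epoch)

weights = train(xs=[1.0, 2.0, 3.0], ys=[2.0, 4.0, 6.0],
                epochs=5, on_epoch_end=monitor)
```

Because the monitor sees the state at every epoch, training can be stopped or tuned early instead of waiting for the full run to finish.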

Figure 3. Surrogate Model Training Monitoring with ParaView

Another interesting use case is the segmentation of aerial LiDAR data. Thanks to LidarView, you can feed your Digital Twin with real-time LiDAR data and run unsupervised segmentation. Below is an example with Meta’s “Segment Anything” segmentation model integrated in LidarView. More about this example can be found in this blog post.

Unsupervised segmentation of a tile from the Lidar HD program using the segment-lidar algorithm within LidarView

Conclusion

Scientific visualization tools are at the core of Digital Twins, enabling advanced processing and prediction, and helping to make decisions and influence the actual system. We have demonstrated that ParaView and its derivatives like LidarView are powerful tools for this purpose.

Interested in building your Digital Twin with ParaView or LidarView? Reach out to us to start the discussion!

Acknowledgements

This work was partly funded by TotalEnergies and an internal effort of Kitware Europe.
