Presentation

PhD Forum: (PhD05) Virtual Reality In Situ Visualization with Interactive Steering for Numerical Simulations
Event Type: PhD Forum
Tags: Molecular Research, Parallel Algorithms, Parallel Applications, Scientific Software Development, Visualization & Virtual Reality
Time: Monday, June 17th, 1pm - 6pm
Location: Foyers
Description:

Recent trends in high-performance computing show computational throughput growing far faster than I/O bandwidth to disk. File I/O has thus become a bottleneck for scientific simulations, which generate large amounts of data. In situ processing, in which visualization or analysis runs in parallel with the simulation, has emerged as a viable solution, as it allows the user to select only regions of interest to be captured and written to disk. It also enables steering of simulations.
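
To illustrate the idea, here is a minimal, self-contained C++ sketch of an in situ hook inside a simulation loop; it is not the system described in this abstract, and the grid sizes, region bounds, and file names are purely illustrative. Every few steps the hook extracts only the user-selected region of interest and writes that subset to disk, rather than the full field.

```cpp
// Minimal sketch (not the author's system): a simulation loop with an in situ
// hook that writes only a user-selected region of interest instead of the
// full field. All names and sizes are illustrative.
#include <cstddef>
#include <fstream>
#include <string>
#include <vector>

struct Region { std::size_t x0, x1, y0, y1, z0, z1; };  // half-open bounds

// Copy the sub-block covered by the region out of a full 3D scalar field.
std::vector<double> extract(const std::vector<double>& field,
                            std::size_t nx, std::size_t ny, std::size_t /*nz*/,
                            const Region& r) {
    std::vector<double> out;
    out.reserve((r.x1 - r.x0) * (r.y1 - r.y0) * (r.z1 - r.z0));
    for (std::size_t z = r.z0; z < r.z1; ++z)
        for (std::size_t y = r.y0; y < r.y1; ++y)
            for (std::size_t x = r.x0; x < r.x1; ++x)
                out.push_back(field[(z * ny + y) * nx + x]);
    return out;
}

int main() {
    const std::size_t nx = 64, ny = 64, nz = 64;
    std::vector<double> field(nx * ny * nz, 0.0);
    Region roi{16, 48, 16, 48, 16, 48};          // region selected by the user

    for (int step = 0; step < 100; ++step) {
        // ... advance the simulation one time step (omitted) ...

        if (step % 10 == 0) {                    // in situ hook, every 10 steps
            std::vector<double> sub = extract(field, nx, ny, nz, roi);
            std::ofstream("roi_step_" + std::to_string(step) + ".raw",
                          std::ios::binary)
                .write(reinterpret_cast<const char*>(sub.data()),
                       sub.size() * sizeof(double));
        }
    }
}
```

In a real in situ pipeline, the extracted region would feed a visualization or analysis routine running alongside the solver rather than a raw file write.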

While in situ visualization has received much research interest, it has not yet been achieved in Virtual Reality (VR) for large simulations. VR, which offers immersive 3D visualization and real-time interactivity, has become widely used in scientific visualization, including for the post hoc visualization of simulations. However, VR involves frequent viewpoint changes due to the user's head and body movements, and it requires data to be rendered at high frame rates and low latency; otherwise the user may experience motion sickness or lose immersion. The results of large simulations are difficult to render in situ under such constraints.

The primary objective of this thesis is to develop a system that performs, and more generally algorithms that enable, immersive in situ visualization and steering of numerical simulations. More specifically, our objective is to build from the ground up an architecture that optimizes frame rate and latency for in situ rendering, leveraging the power of modern high-performance computers. Immersive visualization has been shown to improve the perception of space and geometry, and we believe that in situ visualization in VR would significantly aid scientific discovery in complex three-dimensional simulations. We aim to further enhance the user's immersion through Natural User Interface (NUI) based steering of simulations, e.g., imposing boundary conditions in a fluid simulation through hand gestures. NUIs, interfaces based on body pose and gaze, have been shown to help scientists better understand data.
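
As a purely hypothetical illustration of such NUI-based steering, the following C++ sketch maps a hand-swipe gesture to an inflow boundary-condition command; the Gesture and SteeringCommand types, the scaling factor, and the mapping itself are invented for this example and are not taken from the thesis.

```cpp
// Hypothetical sketch of NUI-based steering: a recognized hand gesture is
// translated into a steering command that updates an inflow boundary
// condition in a fluid simulation. All types and names are illustrative.
#include <iostream>

struct Vec3 { float x, y, z; };

// A gesture as it might be reported by a hand-tracking runtime.
struct Gesture { Vec3 palm_direction; float speed; };

// Command applied to the simulation between time steps.
struct SteeringCommand { Vec3 inflow_velocity; };

// Map a swipe gesture to an inflow boundary condition: the swipe direction
// sets the flow direction, the swipe speed scales its magnitude.
SteeringCommand gestureToBoundaryCondition(const Gesture& g) {
    const float scale = 0.5f;   // tuning factor, chosen arbitrarily here
    return {{g.palm_direction.x * g.speed * scale,
             g.palm_direction.y * g.speed * scale,
             g.palm_direction.z * g.speed * scale}};
}

int main() {
    Gesture swipe{{1.0f, 0.0f, 0.0f}, 2.0f};     // swipe along +x at 2 m/s
    SteeringCommand cmd = gestureToBoundaryCondition(swipe);
    // In a coupled setup, this command would be sent to the simulation,
    // which applies it as the inflow velocity at the next time step.
    std::cout << "inflow = (" << cmd.inflow_velocity.x << ", "
              << cmd.inflow_velocity.y << ", "
              << cmd.inflow_velocity.z << ")\n";
}
```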

As part of our effort to build such an architecture, we have developed an open-source parallel VR rendering library that performs both volume and mesh rendering. It uses the high-performance Vulkan API for rendering and therefore has the potential to harness the power of modern GPUs better than several prevalent in situ solutions, such as ParaView Catalyst and VisIt Libsim. We also develop and maintain an open-source library for scalable numerical simulations, which performs dynamic load balancing and is capable of running on accelerators (GPUs).

We intend to couple the two libraries tightly to minimize the cost of data transfer, with simulation and rendering taking place on the same cluster and, wherever possible, on the same nodes within the cluster. One of the core algorithmic issues we will address is novel-view synthesis of volumetric data, to ensure that latency requirements are met when the user's head position changes. Another challenge will be the optimal redistribution of simulation data for rendering when not all nodes in the cluster have a GPU. Further, we will investigate NUI-based steering of scientific simulations.
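
For the redistribution problem, one possible scheme (an assumption for illustration, not the thesis implementation) is sketched below in C++ with MPI: ranks without a GPU forward their local block to a GPU-equipped rank chosen round-robin. The GPU detection is faked and the block size is arbitrary; a real implementation would query CUDA or Vulkan and exchange actual simulation blocks.

```cpp
// Illustrative sketch (assumed scheme, not the thesis implementation): when
// only some MPI ranks own a GPU, ranks without one forward their local block
// to an assigned GPU rank for rendering. Compile with an MPI compiler, e.g.
//   mpicxx redistribute.cpp -o redistribute
#include <mpi.h>
#include <vector>

int main(int argc, char** argv) {
    MPI_Init(&argc, &argv);
    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    // Placeholder for real GPU detection (e.g. via the CUDA or Vulkan API):
    // here we simply pretend that every even rank owns a GPU.
    int has_gpu = (rank % 2 == 0) ? 1 : 0;

    // Everyone learns which ranks have GPUs.
    std::vector<int> gpu_flags(size);
    MPI_Allgather(&has_gpu, 1, MPI_INT, gpu_flags.data(), 1, MPI_INT,
                  MPI_COMM_WORLD);
    std::vector<int> gpu_ranks;
    for (int r = 0; r < size; ++r)
        if (gpu_flags[r]) gpu_ranks.push_back(r);

    // Each rank's local simulation block (dummy data here).
    std::vector<float> block(1024, static_cast<float>(rank));

    if (!has_gpu) {
        // Round-robin assignment of non-GPU ranks to GPU ranks.
        int target = gpu_ranks[rank % gpu_ranks.size()];
        MPI_Send(block.data(), static_cast<int>(block.size()), MPI_FLOAT,
                 target, 0, MPI_COMM_WORLD);
    } else {
        // Receive the blocks of every non-GPU rank assigned to this rank,
        // then render the local block together with the received ones.
        for (int src = 0; src < size; ++src) {
            if (gpu_flags[src]) continue;
            if (gpu_ranks[src % gpu_ranks.size()] != rank) continue;
            std::vector<float> remote(1024);
            MPI_Recv(remote.data(), static_cast<int>(remote.size()), MPI_FLOAT,
                     src, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
            // ... hand `remote` to the renderer together with `block` ...
        }
    }

    MPI_Finalize();
}
```

Round-robin assignment is the simplest choice; a load-aware assignment that accounts for block sizes and GPU memory would likely be preferable in practice.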