1805 / Accelerating graph networks for real-time physics simulations
Paper presented at the 16th European-African Regional Conference of the ISTVS
Title: Accelerating graph networks for real-time physics simulations
Authors: Mohammad Jasim Usmani and Krzysztof Skonieczny
Abstract: Physics-based simulations are crucial for analyzing interactions between physical objects and are widely used in areas such as terramechanics. The discrete element method models the interactions between soil particles accurately; however, it is very computationally intensive. A promising alternative is the Graph Network-based Simulator (GNS) architecture, which uses graph neural networks to learn and approximate the dynamics of the physical system it is trained on. Although this approach is faster, the high dimensionality of the dataset means that inference is performed on a very large graph and rollouts are not real-time. To accelerate the simulation, an approach recently developed in our laboratory applies dimensionality reduction techniques such as principal component analysis to identify the modes in the dataset that carry the highest variance. These modes are then fed into the Graph Network, along with the rigid body, to train the model. Although the reduced-order dataset results in a much smaller graph than the original dataset, the graph fed into the network is still naively fully connected. This research aims to further accelerate this subspace framework by identifying a partial graph for the network without compromising performance or training. The partial graph is identified using Neural Relational Inference (NRI). The NRI model is based on a variational autoencoder whose encoder extracts a partial graph on which the GNS can still perform inference without loss in accuracy. We demonstrate the effectiveness of our approach on blade-cutting and wheel simulation datasets and show that this framework reduces the inference time from 0.35 seconds per second of simulation to 0.17 seconds, an approximately 2x speedup.
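The following is a minimal sketch, not the authors' implementation, of the subspace idea the abstract describes: particle positions are projected onto the leading PCA modes, the reduced coordinates become the nodes of a small graph, and a learned edge-probability matrix (used here as a stand-in for the NRI encoder output) prunes the naively fully connected graph into a partial one. All function names and the toy data are illustrative assumptions.

```python
# Minimal sketch (not the paper's code): PCA-reduced particle data feeding a
# small graph, with optional edge pruning standing in for the NRI encoder.
import numpy as np

def pca_modes(trajectory, n_modes):
    """trajectory: (n_frames, n_particles * 3) flattened positions.
    Returns the data mean and the top-n_modes principal directions."""
    mean = trajectory.mean(axis=0)
    centered = trajectory - mean
    # SVD of the centered data; rows of vt are the principal directions.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return mean, vt[:n_modes]            # (n_modes, n_particles * 3)

def to_subspace(frame, mean, modes):
    """Reduced coordinates: one scalar per mode (the graph node features)."""
    return modes @ (frame - mean)        # (n_modes,)

def build_edges(n_modes, edge_prob=None, threshold=0.5):
    """Fully connected directed edge list, or a partial graph when edge
    probabilities (e.g. from an NRI-style encoder) are supplied."""
    edges = []
    for i in range(n_modes):
        for j in range(n_modes):
            if i == j:
                continue
            if edge_prob is None or edge_prob[i, j] > threshold:
                edges.append((i, j))
    return edges

# Toy usage: 200 frames of 500 particles in 3-D, reduced to 8 modes.
rng = np.random.default_rng(0)
traj = rng.normal(size=(200, 500 * 3))
mean, modes = pca_modes(traj, n_modes=8)
z = to_subspace(traj[0], mean, modes)

full = build_edges(8)                     # 56 directed edges (fully connected)
probs = rng.uniform(size=(8, 8))          # placeholder for NRI edge probabilities
partial = build_edges(8, edge_prob=probs) # pruned, partial graph
print(len(full), len(partial), z.shape)
```

In the actual framework the reduced coordinates and the rigid body are fed to a Graph Network simulator; the sketch only illustrates why a partial graph over the modes shrinks the edge set the network must process.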