Besides time and cost savings, surrogate models could lead to new scientific discoveries, owing to their ability to handle large volumes of high-dimensional data.
New research from the Lawrence Livermore National Laboratory (LLNL) demonstrated deep learning surrogate models performing as well as, and sometimes better than, far more expensive simulators.
The researchers tested the surrogate model on complicated inertial confinement fusion problems and found it could accurately emulate scientific processes, while also reducing compute time from half an hour to a few seconds.
The new deep learning-driven model significantly outperformed existing surrogate models while faithfully replicating the simulator's results.
“The nice thing about doing all this is not only that it makes the analysis easier, because now you have a common space for all these modalities, but we also showed that doing it this way actually gives you better models, better analysis and objectively better results than with baseline approaches,” said LLNL computer scientist Rushil Anirudh.
“This tool is providing us with a fundamentally different way of connecting simulations to experiments,” said fellow LLNL computer scientist Timo Bremer. “By building these deep learning models, it allows us to directly predict the full complexity of the simulation data.
“Using this common latent space to correlate all these different modalities is going to be extremely valuable, not just for this particular piece of science, but everything that tries to combine computational sciences with experimental sciences. This is something that could potentially lead to new insights in a way that’s just unfeasible right now,” he added.
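To make the “common latent space” idea concrete, here is a minimal toy sketch (not the LLNL architecture, whose details are not given in this article): two encoders project measurements of different sizes, standing in for different experimental modalities, into the same low-dimensional space where a single distance metric can compare them. The encoder weights here are random for illustration; in practice they would be learned, for example with autoencoders trained on simulation data. All names and dimensions below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

LATENT_DIM = 4  # size of the shared latent space

# Random linear encoders for two hypothetical modalities of different
# sizes: a flattened 8x8 diagnostic image (64 values) and 8 scalar
# outputs. A real model would learn these weights from data.
W_image = rng.standard_normal((LATENT_DIM, 64))
W_scalars = rng.standard_normal((LATENT_DIM, 8))

def encode(weights, x):
    """Project a modality vector into the shared latent space."""
    return weights @ x

image = rng.standard_normal(64)    # stand-in for an imaging diagnostic
scalars = rng.standard_normal(8)   # stand-in for scalar diagnostics

z_image = encode(W_image, image)
z_scalars = encode(W_scalars, scalars)

# Both modalities now live in the same 4-dimensional space, so one
# distance measure can relate them directly:
distance = float(np.linalg.norm(z_image - z_scalars))
print(z_image.shape, z_scalars.shape, distance)
```

The point of the sketch is only that heterogeneous inputs end up as same-shaped latent vectors, which is what makes cross-modal comparison and analysis tractable.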