Exploring Physical Latent Spaces for Deep Learning

11/21/2022
by Chloé Paliard, et al.

We explore training deep neural network models in conjunction with physical simulations via partial differential equations (PDEs), using the simulated degrees of freedom as latent space for the neural network. In contrast to previous work, we do not impose constraints on the simulated space, but rather treat its degrees of freedom purely as tools to be used by the neural network. We demonstrate this concept for learning reduced representations: it is typically extremely challenging for conventional simulations with traditional reduced representations to faithfully preserve the correct solutions over long time spans, and this problem is particularly pronounced for solutions with large amounts of small-scale features. Here, data-driven methods can learn to restore the details required for accurate solutions of the underlying PDE problem. We explore the use of a physical, reduced latent space within this context, and train models such that they can modify the content of the physical states as much as needed to best satisfy the learning objective. Surprisingly, this autonomy allows the neural network to discover alternate dynamics that enable significantly improved performance on the given tasks. We demonstrate this concept for a range of challenging test cases, among them Navier-Stokes-based turbulence simulations.
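To make the setup concrete, the sketch below shows one way such a physical latent space can be wired up. This is a conceptual illustration only, not the authors' implementation: it stands in a 1D diffusion step for the simulator (the paper targets Navier-Stokes and other PDEs), and the module names (CoarseDiffusionStep, Encoder, Decoder), the coarsening factor, and the rollout loss are assumptions chosen for brevity. The key points mirror the abstract: the encoder is free to write arbitrary content into the coarse simulated state, a differentiable PDE step advances that state, and the decoder is trained to recover the fine reference trajectory.

# Minimal sketch (assumptions noted above; not the authors' code): a coarse,
# differentiable PDE solver used as the latent space of a neural network.
import torch
import torch.nn as nn

class CoarseDiffusionStep(nn.Module):
    """One explicit step of 1D diffusion on the coarse (latent) grid."""
    def __init__(self, nu=0.1, dt=0.1, dx=1.0):
        super().__init__()
        self.nu, self.dt, self.dx = nu, dt, dx

    def forward(self, u):
        # u: (batch, 1, n_coarse); periodic boundaries via roll
        lap = (torch.roll(u, 1, -1) - 2 * u + torch.roll(u, -1, -1)) / self.dx**2
        return u + self.dt * self.nu * lap

class Encoder(nn.Module):
    """Maps the fine reference state into the coarse simulated latent state.
    Its output is not constrained to be a down-sampled copy of the input;
    the network may place arbitrary content in the latent state."""
    def __init__(self, factor=4):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(1, 16, 5, padding=2), nn.ReLU(),
            nn.Conv1d(16, 1, 5, stride=factor, padding=2),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Restores fine-scale detail from the coarse latent state."""
    def __init__(self, factor=4):
        super().__init__()
        self.net = nn.Sequential(
            nn.ConvTranspose1d(1, 16, factor, stride=factor), nn.ReLU(),
            nn.Conv1d(16, 1, 5, padding=2),
        )

    def forward(self, z):
        return self.net(z)

def rollout_loss(enc, dec, solver, x0, refs):
    """Encode, advance the latent state with the differentiable solver,
    decode at every step, and compare against the fine reference trajectory."""
    z = enc(x0)
    loss = 0.0
    for ref in refs:                 # refs: list of fine states x_1..x_T
        z = solver(z)                # gradients flow through the PDE step
        loss = loss + torch.mean((dec(z) - ref) ** 2)
    return loss / len(refs)

# Example usage (shapes are illustrative: fine grid of 64 cells, 4x coarsening):
#   enc, dec, solver = Encoder(), Decoder(), CoarseDiffusionStep()
#   x0 = torch.randn(8, 1, 64); refs = [torch.randn(8, 1, 64) for _ in range(10)]
#   rollout_loss(enc, dec, solver, x0, refs).backward()

Because the loss only constrains the decoded output, the solver's degrees of freedom act purely as tools for the network, which is the autonomy the abstract refers to.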


