Exploring Physical Latent Spaces for Deep Learning

by Chloé Paliard, et al.

We explore training deep neural network models in conjunction with physical simulations via partial differential equations (PDEs), using the simulated degrees of freedom as a latent space for the neural network. In contrast to previous work, we do not impose constraints on the simulated space, but rather treat its degrees of freedom purely as tools to be used by the neural network. We demonstrate this concept for learning reduced representations. It is typically extremely challenging for conventional simulations with traditional, reduced representations to faithfully preserve the correct solutions over long time spans; this problem is particularly pronounced for solutions with large amounts of small-scale features. Here, data-driven methods can learn to restore the details required for accurate solutions of the underlying PDE problem. We explore the use of a physical, reduced latent space within this context, and train models such that they can modify the content of the physical states as much as needed to best satisfy the learning objective. Surprisingly, this autonomy allows the neural network to discover alternate dynamics that enable significantly improved performance on the given tasks. We demonstrate this concept for a range of challenging test cases, among others for Navier-Stokes-based turbulence simulations.
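The core idea described above, a coarse (reduced) simulation whose latent states a learned component may modify so the rollout better matches a projected fine-grid reference, can be sketched in miniature. The sketch below is an illustrative assumption, not the paper's actual method: it substitutes a 1-D heat equation for the Navier-Stokes setting, average pooling for the reduction, and a linear correction operator fitted by least squares in place of a trained neural network.

```python
import numpy as np

# Illustrative sketch only: all names and the least-squares "training" below are
# assumptions standing in for a learned correction trained with a differentiable
# solver; they are not the authors' architecture.

def heat_step(u, nu, dx, dt):
    """One explicit finite-difference step of u_t = nu * u_xx (periodic BCs)."""
    lap = (np.roll(u, -1) - 2.0 * u + np.roll(u, 1)) / dx**2
    return u + dt * nu * lap

def downsample(u_fine, factor):
    """Average pooling: project a fine-grid state onto the coarse grid."""
    return u_fine.reshape(-1, factor).mean(axis=1)

rng = np.random.default_rng(0)
n_fine, factor = 256, 4
n_coarse = n_fine // factor
dx_f, dx_c = 1.0 / n_fine, 1.0 / n_coarse
nu, dt = 1e-3, 1e-4

# Collect pairs (coarse solver prediction -> projected fine reference).
X, Y = [], []
for _ in range(200):
    u_f = rng.standard_normal(n_fine)
    u_c = downsample(u_f, factor)
    pred = heat_step(u_c, nu, dx_c, dt)                      # reduced simulation
    ref = downsample(heat_step(u_f, nu, dx_f, dt), factor)   # projected truth
    X.append(pred)
    Y.append(ref)
X, Y = np.asarray(X), np.asarray(Y)

# Fit a linear correction operator W by least squares: Y ≈ X @ W.
# Since W = I is feasible, the corrected error can never exceed the plain one.
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

def corrected_step(u_c):
    """Reduced solver step followed by the learned correction of the latent state."""
    return heat_step(u_c, nu, dx_c, dt) @ W

err_plain = np.mean((X - Y) ** 2)
err_corr = np.mean((X @ W - Y) ** 2)
print(f"coarse-only MSE: {err_plain:.3e}, corrected MSE: {err_corr:.3e}")
```

Note that the correction acts directly on the reduced states and is unconstrained, which mirrors the abstract's point that the latent degrees of freedom are treated purely as tools: the fitted operator is free to move the coarse state away from a "physically faithful" trajectory if that reduces the rollout error against the reference.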
