Explicable hyper-reduced order models on nonlinearly approximated solution manifolds of compressible and incompressible Navier-Stokes equations

by Francesco Romor, et al.

A slowly decaying Kolmogorov n-width of the solution manifold of a parametric partial differential equation precludes efficient linear projection-based reduced-order models, because the reduced space must be high-dimensional to approximate the solution manifold with sufficient accuracy. To address this problem, neural networks of various architectures have been employed to build accurate nonlinear regressions of solution manifolds. However, most implementations are non-intrusive black-box surrogate models, and only some of them reduce the dimension from the number of degrees of freedom of the discretized parametric model to a latent dimension. We present a new intrusive and explicable methodology for reduced-order modelling that employs neural networks for solution manifold approximation but does not discard the underlying physical and numerical models in the predictive/online stage. We focus on autoencoders used to further compress the dimensionality of linear approximants of solution manifolds, achieving a nonlinear dimension reduction. Having obtained an accurate nonlinear approximant, we seek the solutions on the latent manifold with the residual-based nonlinear least-squares Petrov-Galerkin method, suitably hyper-reduced so as to be independent of the number of degrees of freedom. We also develop new adaptive hyper-reduction strategies and employ local nonlinear approximants. We test our methodology on two nonlinear time-dependent parametric benchmarks: a supersonic flow past a NACA airfoil with varying Mach number and an incompressible turbulent flow around the Ahmed body with varying slant angle.
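The core idea of the abstract (a nonlinear decoder composed with a linear basis, with latent solutions found by residual minimization) can be illustrated with a minimal toy sketch. Everything below is hypothetical: a quadratic map `g` stands in for the trained autoencoder decoder, a linear system stands in for the discretized (and, in the paper, hyper-reduced) Navier-Stokes residual, and `lspg_step` performs one Gauss-Newton iteration of the nonlinear least-squares Petrov-Galerkin method over the latent coordinate.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, r = 50, 5, 2  # toy full-order size, linear basis size, latent size
V = np.linalg.qr(rng.standard_normal((n, m)))[0]  # orthonormal (POD-like) basis

def g(z):
    # Toy nonlinear "decoder": quadratic embedding of the latent coordinate
    # into the coefficients of the linear basis V, so u(z) = V @ g(z).
    z1, z2 = z
    return np.array([z1, z2, z1 * z2, z1**2, z2**2])

def residual(u, mu):
    # Stand-in for the FOM residual of a parametric PDE: A u - b(mu).
    A = np.diag(np.linspace(1.0, 2.0, n))
    b = mu * np.ones(n)
    return A @ u - b

def lspg_step(z, mu, eps=1e-6):
    # One Gauss-Newton step of nonlinear least-squares Petrov-Galerkin:
    # minimize ||residual(V g(z), mu)||^2 over the latent coordinate z,
    # with a finite-difference Jacobian of the residual w.r.t. z.
    r0 = residual(V @ g(z), mu)
    J = np.column_stack([
        (residual(V @ g(z + eps * e), mu) - r0) / eps
        for e in np.eye(len(z))
    ])
    dz, *_ = np.linalg.lstsq(J, -r0, rcond=None)
    return z + dz

z, mu = np.zeros(r), 1.0
for _ in range(20):
    z = lspg_step(z, mu)
print(np.linalg.norm(residual(V @ g(z), mu)))  # residual norm on the manifold
```

In the paper the residual would be the hyper-reduced Navier-Stokes residual evaluated on a selected subset of degrees of freedom, so each Gauss-Newton step stays independent of the full-order dimension; the toy version above evaluates the full residual only for simplicity.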

