MFPC-Net: Multi-fidelity Physics-Constrained Neural Process

10/03/2020
by Yating Wang, et al.

In this work, we propose a network that can utilize computationally cheap low-fidelity data together with limited high-fidelity data to train surrogate models, where the multi-fidelity data are generated from multiple underlying models. The network takes as input a context set of (physical observation point, low-fidelity solution at the observed point) and output (high-fidelity solution at the observed point) pairs. It uses a neural process to learn a distribution over functions conditioned on the context set and to provide the mean and standard deviation at target sets. Moreover, the proposed framework also takes into account the available physical laws that govern the data and imposes them as constraints in the loss function. The multi-fidelity physics-constrained network (MFPC-Net) (1) takes datasets obtained from multiple models simultaneously during training, (2) takes advantage of available physical information, (3) learns a stochastic process that can encode prior beliefs about the correlation between the two fidelities from a few observations, and (4) produces predictions with uncertainty. The ability to represent a class of functions is ensured by the properties of the neural process and is achieved by global latent variables in the neural network. Physical constraints are added to the loss using Lagrange multipliers. An algorithm to optimize the loss function is proposed to effectively train the network parameters on an ad hoc basis. Once trained, the network yields fast evaluations over the entire domain of interest given a few observation points from a new low- and high-fidelity model pair. In particular, one can further identify unknown parameters, such as permeability fields in elliptic PDEs, with a simple modification of the network. Several numerical examples for both forward and inverse problems are presented to demonstrate the performance of the proposed method.
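The abstract's core ingredients, a context set of (point, low-fidelity value) pairs encoded into a global latent variable, a decoder predicting the high-fidelity solution, and a loss combining data misfit with a Lagrange-multiplier-weighted physics residual, can be illustrated with a toy sketch. Everything below (network sizes, the stand-in low/high-fidelity solutions, and the 1-D constraint u'' = -π²u) is a hypothetical illustration of the loss structure, not the paper's actual architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(x, W1, b1, W2, b2):
    # One-hidden-layer MLP with tanh activation (illustrative stand-in).
    return np.tanh(x @ W1 + b1) @ W2 + b2

# Context set on a uniform grid: observation points x_c, a stand-in
# low-fidelity solution u_lo, and a stand-in high-fidelity target u_hi.
x_c = np.linspace(0.0, 1.0, 11)[:, None]
u_lo = np.sin(np.pi * x_c)
u_hi = np.sin(np.pi * x_c) + 0.1 * x_c

# Encoder: map (x, u_lo) pairs to a global latent summary r, then split r
# into latent mean/log-std -- the "distribution over functions".
d_in, d_h, d_r = 2, 16, 4
W1 = rng.normal(0, 0.5, (d_in, d_h)); b1 = np.zeros(d_h)
W2 = rng.normal(0, 0.5, (d_h, 2 * d_r)); b2 = np.zeros(2 * d_r)
r = mlp(np.hstack([x_c, u_lo]), W1, b1, W2, b2).mean(axis=0)
z_mean, z_logstd = r[:d_r], r[d_r:]

# Decoder: predict the high-fidelity solution at target points from (x, z).
# Here we use the latent mean for a deterministic forward pass.
W3 = rng.normal(0, 0.5, (1 + d_r, d_h)); b3 = np.zeros(d_h)
W4 = rng.normal(0, 0.5, (d_h, 1)); b4 = np.zeros(1)
z_tiled = np.tile(z_mean, (len(x_c), 1))
pred = mlp(np.hstack([x_c, z_tiled]), W3, b3, W4, b4)

# Loss: data misfit plus a physics residual weighted by a multiplier lam.
# Toy constraint: u'' + pi^2 u = 0 (satisfied by sin(pi x)), discretized
# with a second-order central finite difference on the interior points.
lam = 0.5
u = pred.ravel()
h = x_c[1, 0] - x_c[0, 0]
u_xx = (u[2:] - 2.0 * u[1:-1] + u[:-2]) / h**2
data_loss = np.mean((pred - u_hi) ** 2)
phys_loss = np.mean((u_xx + np.pi**2 * u[1:-1]) ** 2)
total_loss = data_loss + lam * phys_loss
```

In training, the multiplier `lam` (and the network weights) would be updated iteratively; the sketch only shows how the constrained loss is assembled from the two terms.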

