Neural networks catching up with finite differences in solving partial differential equations in higher dimensions

12/14/2017
by V. I. Avrutskiy, et al.

Fully connected multilayer perceptrons are used to obtain numerical solutions of partial differential equations in various dimensions. The independent variables are fed into the input layer, and the output is taken as the value of the solution. To train such a network, one can use the square of the equation's residual as a cost function and minimize it with respect to the weights by gradient descent. Following a previously developed method, derivatives of the equation's residual along random directions in the space of independent variables are also added to the cost function. In the 2D case, a similar procedure is known to produce results of nearly machine precision using fewer than 8 grid points per dimension. The same effect is observed here in higher dimensions: solutions are obtained on low-density grids yet retain their precision throughout the entire region. Boundary value problems for linear and nonlinear Poisson equations are solved inside 2-, 3-, 4-, and 5-dimensional balls. The grids for the linear cases have 40, 159, 512, and 1536 points, and those for the nonlinear cases 64, 350, 1536, and 6528 points, respectively. In all cases the maximum error is less than 8.8·10^-6 and the median error is less than 2.4·10^-6. These very weak grid requirements allow a neural network to solve the 5D linear problem within 22 minutes, whereas the projected solving time for finite differences on the same hardware is 50 minutes. The method is applied to a second-order equation but requires little to no modification to solve systems or higher-order PDEs.
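
To make the training setup concrete, below is a minimal sketch in JAX of the cost function described in the abstract: the squared residual of a linear Poisson equation on a sparse set of interior points, augmented with squared derivatives of that residual along random unit directions, plus a simple Dirichlet boundary penalty. The network widths, the right-hand side f, the learning rate, the point counts, and the plain gradient-descent loop are illustrative assumptions, not the authors' exact setup.

```python
import jax
import jax.numpy as jnp


def init_mlp(key, sizes):
    """Initialize a fully connected perceptron with the given layer widths."""
    params = []
    for d_in, d_out in zip(sizes[:-1], sizes[1:]):
        key, sub = jax.random.split(key)
        w = jax.random.normal(sub, (d_in, d_out)) / jnp.sqrt(d_in)
        params.append((w, jnp.zeros(d_out)))
    return params


def mlp(params, x):
    """Network output u(x) for a single point x in R^d."""
    for w, b in params[:-1]:
        x = jnp.tanh(x @ w + b)
    w, b = params[-1]
    return (x @ w + b)[0]


def residual(params, x):
    """Residual of a linear Poisson equation, Δu - f = 0, at point x.
    The right-hand side f is an illustrative placeholder."""
    laplacian = jnp.trace(jax.hessian(lambda y: mlp(params, y))(x))
    f = jnp.sum(x ** 2)
    return laplacian - f


def loss(params, interior, boundary, dirs):
    """Squared residual on interior points, plus squared derivatives of the
    residual along random unit directions, plus a Dirichlet boundary penalty."""
    r = jax.vmap(lambda x: residual(params, x))(interior)
    dr = jax.vmap(
        lambda x, v: jax.jvp(lambda y: residual(params, y), (x,), (v,))[1]
    )(interior, dirs)
    u_b = jax.vmap(lambda x: mlp(params, x))(boundary)  # target boundary value 0
    return jnp.mean(r ** 2) + jnp.mean(dr ** 2) + jnp.mean(u_b ** 2)


def sample_ball(key, n, d):
    """Uniform points inside the d-dimensional unit ball (a stand-in grid)."""
    k1, k2 = jax.random.split(key)
    v = jax.random.normal(k1, (n, d))
    v = v / jnp.linalg.norm(v, axis=1, keepdims=True)
    r = jax.random.uniform(k2, (n, 1)) ** (1.0 / d)
    return r * v


@jax.jit
def step(params, interior, boundary, dirs, lr=1e-3):
    """One plain gradient-descent update of the weights."""
    grads = jax.grad(loss)(params, interior, boundary, dirs)
    return jax.tree_util.tree_map(lambda p, g: p - lr * g, params, grads)


key, k_int, k_bnd, k_dir = jax.random.split(jax.random.PRNGKey(0), 4)
d = 2                                        # spatial dimension
params = init_mlp(key, [d, 64, 64, 1])
interior = sample_ball(k_int, 40, d)         # 40 points, matching the sparse 2D grid
boundary = sample_ball(k_bnd, 40, d)
boundary = boundary / jnp.linalg.norm(boundary, axis=1, keepdims=True)
dirs = jax.random.normal(k_dir, (40, d))     # one random direction per interior point
dirs = dirs / jnp.linalg.norm(dirs, axis=1, keepdims=True)

for _ in range(1000):
    params = step(params, interior, boundary, dirs)
```

The directional-derivative term `dr` is what distinguishes this cost from a plain residual-only loss; per the abstract, it is this extra term that lets such sparse grids retain precision over the whole region.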


research
02/28/2018

Avoiding overfitting of multilayer perceptrons by training derivatives

Resistance to overfitting is observed for neural networks trained with e...
research
05/18/2022

The BLUES function method for second-order partial differential equations: application to a nonlinear telegrapher equation

An analytic iteration sequence based on the extension of the BLUES (Beyo...
research
12/12/2017

Enhancing approximation abilities of neural networks by training derivatives

Method for increasing precision of feedforward networks is presented. Wi...
research
08/05/2019

Solving Partial Differential Equations on Closed Surfaces with Planar Cartesian Grids

We present a general purpose method for solving partial differential equ...
research
09/30/2019

Training-Free Artificial Neural Networks

We present a numerical scheme for the computation of Artificial Neural N...
research
02/06/2023

Solving Maxwell's Equation in 2D with Neural Networks with Local Converging Inputs

In this paper we apply neural networks with local converging inputs (NNL...
