Notes on Exact Boundary Values in Residual Minimisation

05/06/2021
by Johannes Müller, et al.

We analyse the difference in convergence behaviour between exact and penalised boundary values for the residual minimisation of PDEs with neural-network-type ansatz functions, as is commonly done in the context of physics-informed neural networks. It is known that using an L^2 boundary penalty incurs a loss of regularity of 3/2, meaning that approximation in H^2 yields a priori estimates only in H^{1/2}. These notes demonstrate how this loss of regularity can be circumvented if the functions in the ansatz class satisfy the boundary values exactly. Furthermore, we show that in this case the loss function provides a consistent a posteriori estimator, in H^2 norm, of the error made by the residual minimisation method. We provide analogous results for linear time-dependent problems and discuss the implications of measuring the residual in Sobolev norms.
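As a minimal illustration of the exact-boundary-value construction the abstract refers to (this is a generic sketch, not the paper's specific method; the names `g_lift`, `network`, and the polynomial stand-in are illustrative assumptions), one can enforce Dirichlet data on [0, 1] by construction, so that the loss consists of the interior PDE residual alone and no boundary penalty term is needed:

```python
import numpy as np

# Prescribed Dirichlet data: u(0) = a, u(1) = b.
a, b = 1.0, 3.0

def g_lift(x):
    """Linear lifting that interpolates the boundary data."""
    return a + (b - a) * x

def network(x, theta):
    """Stand-in for a neural-network ansatz: a small polynomial in x."""
    return theta[0] + theta[1] * x + theta[2] * x**2

def u(x, theta):
    """Hard-constrained ansatz u(x) = g_lift(x) + x*(1-x)*N(x).

    The factor x*(1-x) vanishes on the boundary, so u(0) = a and
    u(1) = b hold exactly for *every* parameter vector theta.
    """
    return g_lift(x) + x * (1.0 - x) * network(x, theta)

# The boundary values are exact regardless of the parameters:
theta = np.array([0.7, -1.2, 2.5])
print(np.isclose(u(0.0, theta), a))  # True
print(np.isclose(u(1.0, theta), b))  # True
```

Because the boundary condition holds identically in the parameters, minimising the residual never trades interior accuracy against boundary accuracy, which is the mechanism behind avoiding the H^{1/2} regularity loss discussed above.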
