Simultaneous Neural Network Approximations in Sobolev Spaces

09/01/2021
by   Sean Hon, et al.

We establish in this work approximation results of deep neural networks for smooth functions measured in Sobolev norms, motivated by recent developments of numerical solvers for partial differential equations using deep neural networks. The error bounds are explicitly characterized in terms of both the width and depth of the networks simultaneously. Namely, for f ∈ C^s([0,1]^d), we show that deep ReLU networks of width 𝒪(N log N) and depth 𝒪(L log L) can achieve a non-asymptotic approximation rate of 𝒪(N^{-2(s-1)/d} L^{-2(s-1)/d}) with respect to the 𝒲^{1,p}([0,1]^d) norm for p ∈ [1,∞). If either the ReLU function or its square is used as the activation function to construct deep neural networks of width 𝒪(N log N) and depth 𝒪(L log L) to approximate f ∈ C^s([0,1]^d), the non-asymptotic approximation rate is 𝒪(N^{-2(s-n)/d} L^{-2(s-n)/d}) with respect to the 𝒲^{n,p}([0,1]^d) norm for p ∈ [1,∞).
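As a rough illustration of how the stated rate behaves (this sketch is not from the paper; the function name and parameter choices below are illustrative, and all constants hidden by the 𝒪 notation are omitted), one can evaluate the bound N^{-2(s-n)/d} L^{-2(s-n)/d} for sample values of the width parameter N, depth parameter L, smoothness s, input dimension d, and Sobolev order n:

```python
def approximation_rate(N, L, s, d, n=1):
    """Rate N^{-2(s-n)/d} * L^{-2(s-n)/d} from the abstract,
    with all multiplicative constants omitted (illustrative only)."""
    assert s > n >= 0 and N >= 1 and L >= 1 and d >= 1
    return (N * L) ** (-2.0 * (s - n) / d)

# Smoother targets (larger s) and wider/deeper networks (larger N, L)
# both shrink the error bound; measuring in a higher-order Sobolev
# norm (larger n) weakens the rate.
r_base   = approximation_rate(N=10, L=10, s=3, d=2, n=1)   # (100)^{-2} = 1e-4
r_wider  = approximation_rate(N=100, L=10, s=3, d=2, n=1)  # smaller than r_base
r_higher = approximation_rate(N=10, L=10, s=3, d=2, n=2)   # larger than r_base
```

Note that for n ≥ 2 the abstract requires the ReLU-or-squared-ReLU construction, since plain ReLU networks are piecewise linear and have no higher-order weak derivatives to control.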
