Sequential sampling for optimal weighted least squares approximations in hierarchical spaces

05/28/2018
by Benjamin Arras, et al.

We consider the problem of approximating an unknown function u ∈ L^2(D,ρ) from its evaluations at given sampling points x^1, ..., x^n ∈ D, where D ⊂ R^d is a general domain and ρ is a probability measure. The approximation is picked in a linear space V_m, where m = dim(V_m), and computed by a weighted least squares method. Recent results show the advantages of picking the sampling points at random according to a well-chosen probability measure μ that depends both on V_m and ρ. With such a random design, the weighted least squares approximation is proved to be stable with high probability, and to have accuracy comparable to that of the exact L^2(D,ρ)-orthogonal projection onto V_m, in a near-linear sampling regime n ∼ m log m. The present paper is motivated by the adaptive approximation context, in which one typically generates a nested sequence of spaces (V_m)_{m≥1} with increasing dimension. Although the measure μ = μ_m changes with V_m, it is possible to recycle the previously generated samples by interpreting μ_m as a mixture between μ_{m-1} and an update measure σ_m. Based on this observation, we discuss sequential sampling algorithms that maintain the stability and approximation properties uniformly over all spaces V_m. Our main result is that the total number of computed samples at step m remains of the order m log m with high probability. Numerical experiments confirm this analysis.
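To make the scheme concrete, here is a minimal sketch of both ingredients, under illustrative assumptions not fixed by the abstract: D = [-1,1], ρ the uniform measure, V_m spanned by the first m orthonormal Legendre polynomials φ_0, ..., φ_{m-1}, and μ_m taken as the density (1/m) Σ_{j<m} |φ_j|^2 ρ commonly used as the optimal sampling measure in this literature. With this choice, μ_m = ((m-1)/m) μ_{m-1} + (1/m) σ_m with σ_m = |φ_{m-1}|^2 ρ, which is the mixture structure the recycling step exploits. All function names and the rejection sampler below are hypothetical choices, not the authors' implementation.

```python
# Minimal sketch under the assumptions stated above: D = [-1, 1],
# rho uniform, V_m spanned by the first m orthonormal Legendre
# polynomials. Illustrative only, not the paper's implementation.
import numpy as np
from numpy.polynomial import legendre

def onb(x, m):
    """Evaluate the orthonormal Legendre basis phi_0, ..., phi_{m-1} at x."""
    V = legendre.legvander(np.atleast_1d(x), m - 1)   # columns P_0 .. P_{m-1}
    return V * np.sqrt(2.0 * np.arange(m) + 1.0)      # normalize w.r.t. dx/2

def sample_sigma(j, rng):
    """One draw from |phi_j|^2 rho by rejection sampling, with proposal rho
    and envelope sup |phi_j|^2 = 2j + 1 (attained at x = +-1)."""
    while True:
        x = rng.uniform(-1.0, 1.0)
        phi = np.sqrt(2 * j + 1) * legendre.legval(x, [0.0] * j + [1.0])
        if rng.uniform() * (2 * j + 1) <= phi ** 2:
            return x

def sample_mu(m, n, rng):
    """n i.i.d. draws from mu_m = (1/m) sum_{j<m} |phi_j|^2 rho, using its
    interpretation as an equal-weight mixture of the measures |phi_j|^2 rho."""
    return np.array([sample_sigma(j, rng) for j in rng.integers(0, m, size=n)])

def extend_samples(x_old, m, n_new, rng):
    """Build n_new draws from mu_m while recycling draws from mu_{m-1},
    using mu_m = ((m-1)/m) mu_{m-1} + (1/m) sigma_m, sigma_m = |phi_{m-1}|^2 rho."""
    out, old = [], list(x_old)
    for _ in range(n_new):
        if rng.uniform() < (m - 1) / m:
            # a mu_{m-1} draw: recycle an old sample while any are left
            out.append(old.pop() if old else sample_mu(m - 1, 1, rng)[0])
        else:
            out.append(sample_sigma(m - 1, rng))      # draw from the update measure
    return np.array(out)

def wls(x, y, m):
    """Weighted least squares fit in V_m with the optimal weights
    w(x) = d rho / d mu_m (x) = m / sum_{j<m} phi_j(x)^2."""
    Phi = onb(x, m)                                   # n x m design matrix
    w = m / np.sum(Phi ** 2, axis=1)
    G = (Phi * w[:, None]).T @ Phi / len(x)           # empirical Gramian, ~ identity
    b = (Phi * w[:, None]).T @ y / len(x)
    return np.linalg.solve(G, b)                      # coefficients in the basis
```

A possible usage pattern, with the sample count n chosen of the order m log m as in the abstract:

```python
rng = np.random.default_rng(0)
u = np.exp                                  # stand-in for the unknown function
m, n = 5, 60
x = sample_mu(m, n, rng)
c = wls(x, u(x), m)
# grow V_m -> V_{m+1}: recycle the existing samples instead of resampling
x2 = extend_samples(x, m + 1, 72, rng)
c2 = wls(x2, u(x2), m + 1)
```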
