Randomized least-squares with minimal oversampling and interpolation in general spaces

06/12/2023
by Abdellah Chkifa et al.

In approximation of functions based on point values, least-squares methods provide more stability than interpolation, at the expense of increasing the sampling budget. We show that near-optimal approximation error can nevertheless be achieved, in an expected L^2 sense, as soon as the sample size m is larger than the dimension n of the approximation space by a constant ratio. On the other hand, for m=n, we obtain an interpolation strategy with a stability factor of order n. The proposed sampling algorithms are greedy procedures based on arXiv:0808.0163 and arXiv:1508.03261, with polynomial computational complexity.
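As an illustration of the kind of weighted least-squares scheme the abstract refers to (a minimal sketch, not the paper's greedy sampling algorithm), the following Python snippet approximates u(x) = exp(x) on [-1, 1] in the span of the first n orthonormal Legendre polynomials. Samples are drawn i.i.d. from the optimal density (k_n/n) dμ, where k_n(x) = Σ_j L_j(x)² is the inverse Christoffel function, and the fit uses weights w(x) = n/k_n(x), as in the i.i.d. weighted least-squares framework of arXiv:1508.03261. The target function, the choice m = 3n, and the rejection-sampling step are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 8, 24                       # space dimension n, sample budget m = 3n

def onb(x, n):
    # Orthonormal Legendre basis w.r.t. dmu = dx/2 on [-1, 1]: L_j = sqrt(2j+1) P_j.
    V = np.polynomial.legendre.legvander(np.atleast_1d(x), n - 1)
    return V * np.sqrt(2.0 * np.arange(n) + 1.0)

def christoffel(x, n):
    # k_n(x) = sum_j L_j(x)^2; its maximum over [-1, 1] is n^2 (at x = +/-1).
    return np.sum(onb(x, n) ** 2, axis=1)

# Rejection-sample m points from the optimal density (k_n / n) dmu,
# using k_n <= n^2 as the envelope constant.
xs = []
while len(xs) < m:
    x = rng.uniform(-1.0, 1.0)
    if rng.uniform(0.0, 1.0) < christoffel(x, n)[0] / n**2:
        xs.append(x)
xs = np.array(xs)

u = np.exp                         # target function (illustrative choice)
w = n / christoffel(xs, n)         # weights w(x) = n / k_n(x)

# Weighted least squares: scale rows by sqrt(w_i) and solve in the l2 sense.
A = np.sqrt(w)[:, None] * onb(xs, n)
b = np.sqrt(w) * u(xs)
coef, *_ = np.linalg.lstsq(A, b, rcond=None)

grid = np.linspace(-1.0, 1.0, 501)
err = np.max(np.abs(u(grid) - onb(grid, n) @ coef))
print(f"max error on [-1, 1]: {err:.2e}")
```

With only m = 3n samples the weights keep the random design well conditioned in expectation, so the fit tracks the best degree-(n-1) approximation of exp, which is already very accurate.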


