Optimal-order convergence of Nesterov acceleration for linear ill-posed problems

01/20/2021
by   Stefan Kindermann, et al.

We show that Nesterov acceleration is an optimal-order iterative regularization method for linear ill-posed problems, provided that a parameter is chosen according to the smoothness of the solution. This result is proven both for an a priori stopping rule and for the discrepancy principle. The essential tool for obtaining this result is a representation of the residual polynomials via Gegenbauer polynomials.
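To make the setting concrete, the iteration studied here is Nesterov-accelerated Landweber iteration for A x = y with noisy data, stopped by the discrepancy principle. Below is a minimal NumPy sketch under stated assumptions: A is a matrix discretization of the operator, y_delta is noisy data with noise level delta, and the momentum parameter alpha and threshold tau are illustrative placeholders; the paper's precise smoothness-dependent parameter choice is not reproduced here.

```python
import numpy as np

def nesterov_landweber(A, y_delta, delta, alpha=3.0, tau=1.1, max_iter=10000):
    """Nesterov-accelerated Landweber iteration for A x = y (illustrative sketch).

    Stops via the discrepancy principle ||A x_k - y_delta|| <= tau * delta.
    The momentum parameter `alpha` stands in for the parameter that, per the
    abstract, must be chosen according to the smoothness of the solution;
    its name and default value here are assumptions for illustration only.
    """
    m, n = A.shape
    omega = 1.0 / np.linalg.norm(A, 2) ** 2   # step size 1 / ||A||^2
    x_prev = np.zeros(n)
    x = np.zeros(n)
    k = 0
    for k in range(1, max_iter + 1):
        residual = A @ x - y_delta
        if np.linalg.norm(residual) <= tau * delta:   # discrepancy principle
            break
        # Nesterov extrapolation step
        z = x + (k - 1) / (k + alpha - 1) * (x - x_prev)
        x_prev = x
        # gradient step on 0.5 * ||A z - y_delta||^2
        x = z - omega * (A.T @ (A @ z - y_delta))
    return x, k
```

A typical choice is tau slightly larger than 1; the iterate returned when the residual first drops below tau * delta is the regularized solution.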

Related research

06/15/2022 · Convergence rates of a dual gradient method for constrained linear ill-posed problems
In this paper we consider a dual gradient method for solving linear ill-...

07/14/2022 · Stochastic mirror descent method for linear ill-posed problems in Banach spaces
Consider linear ill-posed problems governed by the system A_i x = y_i fo...

04/13/2021 · Optimal Convergence of the Discrepancy Principle for polynomially and exponentially ill-posed Operators under White Noise
We consider a linear ill-posed equation in the Hilbert space setting und...

01/24/2022 · Stochastic asymptotical regularization for linear inverse problems
We introduce Stochastic Asymptotical Regularization (SAR) methods for th...

09/13/2022 · Dual gradient flow for solving linear ill-posed problems in Banach spaces
We consider determining the -minimizing solution of ill-posed problem A ...

06/24/2019 · An entropic Landweber method for linear ill-posed problems
The aim of this paper is to investigate the use of an entropic projectio...
