Polynomial Preconditioning for Gradient Methods

01/30/2023
by   Nikita Doikov, et al.

We study first-order methods with preconditioning for solving structured nonlinear convex optimization problems. We propose a new family of preconditioners generated by symmetric polynomials. They provide first-order optimization methods with a provable improvement of the condition number, cutting the gaps between the highest eigenvalues, without explicit knowledge of the actual spectrum. We give a stochastic interpretation of this preconditioning in terms of coordinate volume sampling and compare it with other classical approaches, including the Chebyshev polynomials. We show how to incorporate a polynomial preconditioning into the Gradient and Fast Gradient Methods and establish the corresponding global complexity bounds. Finally, we propose a simple adaptive search procedure that automatically chooses the best possible polynomial preconditioning for the Gradient Method, minimizing the objective along a low-dimensional Krylov subspace. Numerical experiments confirm the efficiency of our preconditioning strategies for solving various machine learning problems.
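To illustrate the general idea, here is a minimal sketch of a gradient method whose step is preconditioned by a low-degree polynomial of the (quadratic) Hessian applied to the gradient. The polynomial coefficients used below are purely illustrative placeholders, not the symmetric-polynomial construction or the adaptive Krylov-subspace search proposed in the paper.

```python
import numpy as np

def poly_matvec(A, v, coeffs):
    """Compute p(A) @ v for p(t) = coeffs[0] + coeffs[1]*t + ... via Horner's rule."""
    result = coeffs[-1] * v
    for c in reversed(coeffs[:-1]):
        result = A @ result + c * v
    return result

def preconditioned_gradient_descent(A, b, coeffs, step, iters=200):
    """Gradient descent on f(x) = 0.5*x'Ax - b'x with steps preconditioned by p(A)."""
    x = np.zeros_like(b)
    for _ in range(iters):
        grad = A @ x - b                              # gradient of the quadratic
        x = x - step * poly_matvec(A, grad, coeffs)   # polynomially preconditioned step
    return x

# Example usage on a small ill-conditioned quadratic (eigenvalues in [1, 100]).
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((50, 50)))
A = Q @ np.diag(np.linspace(1.0, 100.0, 50)) @ Q.T
b = rng.standard_normal(50)

# Illustrative degree-1 preconditioner p(t) = 1 - 0.009*t, which damps the
# contribution of the largest eigenvalues before taking the gradient step.
x = preconditioned_gradient_descent(A, b, coeffs=[1.0, -0.009], step=0.05)
print("residual norm:", np.linalg.norm(A @ x - b))
```

The design choice to emphasize here is that the preconditioner is applied only through matrix-vector products with the Hessian, so each iteration costs a few extra matvecs per polynomial degree rather than any explicit eigenvalue computation.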


