Complexity Guarantees for Polyak Steps with Momentum

02/03/2020
by   Mathieu Barré, et al.

In smooth strongly convex optimization, or in the presence of Hölderian error bounds, knowledge of the curvature parameter is critical for obtaining simple methods with accelerated rates. In this work, we study a class of methods, based on Polyak steps, in which this knowledge is replaced by knowledge of the optimal value, f_*. We first establish convergence bounds for the classical case of simple gradient descent with Polyak steps that slightly improve on those previously known; we then derive an accelerated gradient method with Polyak steps and momentum, along with convergence guarantees.
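For context, the classical Polyak step sets the step size from the current suboptimality gap rather than from curvature constants. A minimal sketch of plain gradient descent with Polyak steps (not the momentum variant introduced in the paper), assuming a differentiable objective with known optimal value f_* — the function names here are illustrative, not from the paper:

```python
import numpy as np

def polyak_gradient_descent(f, grad, f_star, x0, n_iters=200):
    """Gradient descent with the classical Polyak step size
    gamma_k = (f(x_k) - f_*) / ||grad f(x_k)||^2,
    which requires f_* but no curvature parameter."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        g = grad(x)
        gnorm2 = float(g @ g)
        if gnorm2 == 0.0:  # stationary point reached
            break
        step = (f(x) - f_star) / gnorm2
        x = x - step * g
    return x

# Example on a simple strongly convex quadratic, f(x) = ||x||^2 / 2,
# whose optimal value is f_* = 0 at x = 0.
x_final = polyak_gradient_descent(
    f=lambda x: 0.5 * float(x @ x),
    grad=lambda x: x,
    f_star=0.0,
    x0=np.array([3.0, -4.0]),
)
```

On this quadratic the Polyak step equals the exact minimizing step, so the iterates reach the optimum immediately; on general smooth strongly convex functions the step adapts automatically to the unknown curvature.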


