From Nesterov's Estimate Sequence to Riemannian Acceleration

01/24/2020
by Kwangjun Ahn, et al.

We propose the first global accelerated gradient method for Riemannian manifolds. Toward establishing our result, we revisit Nesterov's estimate sequence technique and develop an alternative analysis of it that may also be of independent interest. We then extend this analysis to the Riemannian setting, localizing the key difficulty due to the non-Euclidean structure into a certain “metric distortion.” We control this distortion by developing a novel geometric inequality, which permits us to propose and analyze a Riemannian counterpart to Nesterov's accelerated gradient method.
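For context, the estimate sequence machinery the abstract refers to can be summarized as follows (this is the standard definition from Nesterov's Introductory Lectures on Convex Optimization; the notation here is ours and may differ from the paper's). A sequence of functions $(\phi_k)_{k \ge 0}$ together with scalars $\lambda_k \to 0$ is an estimate sequence for $f$ if

$$\phi_k(x) \le (1 - \lambda_k)\, f(x) + \lambda_k\, \phi_0(x) \quad \text{for all } x.$$

If the iterates additionally satisfy $f(x_k) \le \min_x \phi_k(x)$, then evaluating the inequality at a minimizer $x^*$ gives

$$f(x_k) - f(x^*) \le \lambda_k \bigl( \phi_0(x^*) - f(x^*) \bigr),$$

so the decay rate of $\lambda_k$ is exactly the convergence rate of the method.

To make the Euclidean baseline concrete, below is a minimal sketch of Nesterov's accelerated gradient method for an L-smooth, mu-strongly convex objective, i.e., the algorithm the paper generalizes. In the Riemannian counterpart, the straight-line extrapolation and gradient steps are replaced by updates along geodesics (via exponential and logarithm maps), which this sketch does not implement; the function name and constant-momentum choice are illustrative, not the paper's code.

```python
import numpy as np

def nesterov_agd(grad, x0, L, mu, n_iters=100):
    """Euclidean Nesterov acceleration (constant-momentum variant)
    for an L-smooth, mu-strongly convex objective."""
    kappa = L / mu                                      # condition number
    beta = (np.sqrt(kappa) - 1) / (np.sqrt(kappa) + 1)  # momentum weight
    x, x_prev = x0.astype(float), x0.astype(float)
    for _ in range(n_iters):
        y = x + beta * (x - x_prev)   # extrapolation step: in the Riemannian
                                      # version this becomes a geodesic update
        x_prev = x
        x = y - grad(y) / L           # gradient step from the extrapolated point
    return x

# Example: minimize f(x) = 0.5 x'Ax - b'x with A = diag(1, 10)
A = np.diag([1.0, 10.0])
b = np.array([1.0, 1.0])
x_opt = nesterov_agd(lambda x: A @ x - b, np.zeros(2), L=10.0, mu=1.0)
# x_opt approaches the solution A^{-1} b = [1.0, 0.1]
```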


Related research

11/04/2021
A Riemannian Accelerated Proximal Extragradient Framework and its Implications
The study of accelerated gradient methods in Riemannian optimization has...

06/07/2018
Towards Riemannian Accelerated Gradient Methods
We propose a Riemannian version of Nesterov's Accelerated Gradient algor...

05/25/2023
Accelerated Methods for Riemannian Min-Max Optimization Ensuring Bounded Geometric Penalties
In this work, we study optimization problems of the form min_x max_y f(x...

11/26/2022
Accelerated Riemannian Optimization: Handling Constraints with a Prox to Bound Geometric Penalties
We propose a globally-accelerated, first-order method for the optimizati...

07/02/2022
Geometric Learning of Hidden Markov Models via a Method of Moments Algorithm
We present a novel algorithm for learning the parameters of hidden Marko...

04/15/2021
Accelerated Optimization on Riemannian Manifolds via Discrete Constrained Variational Integrators
A variational formulation for accelerated optimization on normed spaces ...

01/14/2021
No-go Theorem for Acceleration in the Hyperbolic Plane
In recent years there has been significant effort to adapt the key tools...
