An accelerated first-order method for non-convex optimization on manifolds

08/05/2020
by   Chris Criscitiello, et al.

We describe the first gradient methods on Riemannian manifolds to achieve accelerated rates in the non-convex case. Under Lipschitz assumptions on the Riemannian gradient and Hessian of the cost function, these methods find approximate first-order critical points strictly faster than regular gradient descent. A randomized version also finds approximate second-order critical points. Both the algorithms and their analyses build extensively on existing work in the Euclidean case. The basic operation consists in running the Euclidean accelerated gradient descent method (appropriately safeguarded against non-convexity) in the current tangent space, then moving back to the manifold and repeating. This requires lifting the cost function from the manifold to the tangent space, which can be done for example through the Riemannian exponential map. For this approach to succeed, the lifted cost function (called the pullback) must retain certain Lipschitz properties. As a contribution of independent interest, we prove precise claims to that effect, with explicit constants. Those claims are affected by the Riemannian curvature of the manifold, which in turn affects the worst-case complexity bounds for our optimization algorithms.


Related research

- Decentralized Riemannian Conjugate Gradient Method on the Stiefel Manifold (08/21/2023). The conjugate gradient method is a crucial first-order optimization meth...
- Acceleration in Hyperbolic and Spherical Spaces (12/07/2020). We further research on the acceleration phenomenon on Riemannian manifol...
- Escape saddle points faster on manifolds via perturbed Riemannian stochastic recursive gradient (10/23/2020). In this paper, we propose a variant of Riemannian stochastic recursive g...
- Accelerated Multiplicative Weights Update Avoids Saddle Points almost always (04/25/2022). We consider non-convex optimization problems with constraint that is a p...
- On The Convergence of Gradient Descent for Finding the Riemannian Center of Mass (12/30/2011). We study the problem of finding the global Riemannian center of mass of ...
- Restarted Nonconvex Accelerated Gradient Descent: No More Polylogarithmic Factor in the O(ε^-7/4) Complexity (01/27/2022). This paper studies the accelerated gradient descent for general nonconve...
- A Riemannian smoothing steepest descent method for non-Lipschitz optimization on submanifolds (04/09/2021). In this paper, we propose a Riemannian smoothing steepest descent method...
