Accelerated Stochastic Quasi-Newton Optimization on Riemann Manifolds

04/06/2017
by Anirban Roychowdhury, et al.

We propose an L-BFGS optimization algorithm on Riemannian manifolds that uses minibatched stochastic variance-reduction techniques for fast convergence with constant step sizes, without resorting to line-search methods designed to satisfy Wolfe conditions. We provide a new convergence proof for strongly convex functions that does not rely on curvature conditions on the manifold, as well as a convergence discussion for nonconvex functions. We discuss two ways to obtain the correction pairs used to compute the product of the inverse Hessian with the gradient, and empirically demonstrate their use in synthetic experiments on computing Karcher means of symmetric positive definite matrices and leading eigenvalues of large-scale data matrices. We compare our method with VR-PCA on the latter experiment, and with Riemannian SVRG in both cases, and show strong convergence results on a range of datasets.

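The two computational ingredients named in the abstract are a variance-reduced stochastic gradient and an L-BFGS inverse-Hessian-vector product built from stored correction pairs (s_i, y_i). The sketch below shows both in their standard Euclidean forms only, as a point of reference; it is not the paper's algorithm, which additionally parallel-transports the correction pairs and snapshot gradients between tangent spaces and applies a retraction in the update step. The function names and the minibatch interface are illustrative assumptions.

```python
import numpy as np

def svrg_gradient(grad_i, x, x_snap, full_grad_snap, batch):
    """Minibatched SVRG-style gradient estimate (Euclidean form):
    g = mean_{i in batch}[grad_i(x) - grad_i(x_snap)] + full_grad_snap.
    A Riemannian variant would transport the snapshot terms to the
    tangent space at x before combining them."""
    g = np.zeros_like(full_grad_snap)
    for i in batch:
        g += grad_i(i, x) - grad_i(i, x_snap)
    return g / len(batch) + full_grad_snap

def lbfgs_direction(grad, s_list, y_list):
    """Classical two-loop recursion: approximates H^{-1} grad from the
    stored correction pairs (s_i, y_i), oldest pair first in the lists.
    On a manifold the pairs would first be parallel-transported to the
    current tangent space."""
    q = grad.copy()
    rhos = [1.0 / np.dot(y, s) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: newest pair to oldest.
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        a = rho * np.dot(s, q)
        alphas.append(a)
        q = q - a * y
    # Standard initial scaling of the inverse Hessian approximation.
    gamma = (np.dot(s_list[-1], y_list[-1]) / np.dot(y_list[-1], y_list[-1])
             if s_list else 1.0)
    r = gamma * q
    # Second loop: oldest pair to newest.
    for (s, y, rho), a in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        b = rho * np.dot(y, r)
        r = r + (a - b) * s
    return r  # step direction is -r, taken with a constant step size
```

In an inner iteration one would compute g = svrg_gradient(...), move along -lbfgs_direction(g, s_list, y_list) scaled by the constant step size, and append a new correction pair; the abstract notes that the paper discusses two ways of forming those pairs in the Riemannian setting.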

Related research

03/15/2017
Riemannian stochastic quasi-Newton algorithm with variance reduction and its convergence analysis
Stochastic variance reduction algorithms have recently become popular fo...

11/10/2018
R-SPIDER: A Fast Riemannian Stochastic Optimization Algorithm with Curvature Independent Rate
We study smooth stochastic optimization problems on Riemannian manifolds...

10/09/2019
Nonconvex stochastic optimization on manifolds via Riemannian Frank-Wolfe methods
We study stochastic projection-free methods for constrained optimization...

06/24/2022
Convergence and smoothness analysis of subdivision rules in Riemannian and symmetric spaces
After a discussion on definability of invariant subdivision rules we dis...

08/14/2020
On the globalization of Riemannian Newton method
In the present paper, in order to find a singularity of a vector field de...

02/12/2018
Accelerated Stochastic Matrix Inversion: General Theory and Speeding up BFGS Rules for Faster Second-Order Optimization
We present the first accelerated randomized algorithm for solving linear...
