Fast Incremental Method for Nonconvex Optimization

03/19/2016
by Sashank J. Reddi, et al.

We analyze a fast incremental aggregated gradient method for optimizing nonconvex problems of the form min_x ∑_i f_i(x). Specifically, we analyze the SAGA algorithm within an Incremental First-order Oracle framework, and show that it converges to a stationary point provably faster than both gradient descent and stochastic gradient descent. We also discuss a special class of nonconvex problems, due to Polyak, for which SAGA converges at a linear rate to the global optimum. Finally, we analyze the practically valuable regularized and minibatch variants of SAGA. To our knowledge, this paper presents the first analysis of fast convergence for an incremental aggregated gradient method for nonconvex problems.
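For context, a minimal sketch of the SAGA update for the finite-sum problem min_x (1/n) ∑_i f_i(x) is given below. It is an illustrative implementation only, not the paper's code; the function and parameter names (grad_i, step_size, num_iters) are assumptions made for this example, and the constant step size stands in for the paper's carefully chosen parameters.

```python
import numpy as np

def saga(grad_i, x0, n, step_size, num_iters, rng=None):
    """Illustrative SAGA sketch for min_x (1/n) * sum_i f_i(x).

    grad_i(i, x): returns the gradient of component f_i at x.
    All names and the fixed step size are assumptions for this sketch.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float).copy()
    # Table holding the most recently evaluated gradient of each f_i.
    table = np.stack([grad_i(i, x) for i in range(n)])
    avg = table.mean(axis=0)            # average of the stored gradients
    for _ in range(num_iters):
        j = int(rng.integers(n))        # sample one component uniformly
        g_new = grad_i(j, x)
        # SAGA step: fresh gradient minus its stored value, plus the table average.
        x -= step_size * (g_new - table[j] + avg)
        avg += (g_new - table[j]) / n   # keep the average consistent with the table
        table[j] = g_new
    return x

# Example usage on a toy nonconvex least-squares-style objective (hypothetical data):
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A, b = rng.normal(size=(50, 5)), rng.normal(size=50)
    g = lambda i, x: (np.tanh(A[i] @ x) - b[i]) * (1 - np.tanh(A[i] @ x) ** 2) * A[i]
    x_hat = saga(g, np.zeros(5), n=50, step_size=0.05, num_iters=5000, rng=rng)
```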
