A Stochastic Composite Gradient Method with Incremental Variance Reduction

06/24/2019
by Junyu Zhang, et al.

We consider the problem of minimizing the composition of a smooth (nonconvex) function and a smooth vector mapping, where the inner mapping is in the form of an expectation over some random variable or a finite sum. We propose a stochastic composite gradient method that employs an incremental variance-reduced estimator for both the inner vector mapping and its Jacobian. We show that this method achieves the same orders of complexity as the best known first-order methods for minimizing expected-value and finite-sum nonconvex functions, despite the additional outer composition which renders the composite gradient estimator biased. This finding enables a much broader range of applications in machine learning to benefit from the low complexity of incremental variance-reduction methods.
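The abstract does not include pseudocode, but the estimator it describes can be illustrated concretely. Below is a minimal NumPy sketch of a SARAH-style recursive variance-reduced estimator for the composite gradient ∇g(x)ᵀ∇f(g(x)), maintaining running estimates of both the inner mapping g and its Jacobian, with a periodic anchor pass. All interfaces and parameter names here (g_i, J_i, grad_f, epoch_len, batch, lr) are assumptions chosen for illustration, not taken from the paper.

```python
import numpy as np

def civr_sketch(g_i, J_i, grad_f, x0, n, epochs=10, epoch_len=50,
                batch=8, lr=0.01, rng=None):
    """Illustrative sketch (not the paper's exact method) for
    min_x f(g(x)) with g(x) = (1/n) * sum_i g_i(x).

    Assumed interfaces:
      g_i(i, x)  -> inner mapping component, array of shape (p,)
      J_i(i, x)  -> its Jacobian at x, array of shape (p, d)
      grad_f(y)  -> gradient of the outer function, array of shape (p,)
    """
    rng = np.random.default_rng() if rng is None else rng
    x = x0.copy()
    for _ in range(epochs):
        # Anchor pass at the start of each epoch: evaluate the full
        # finite sum to reset the estimators of g(x) and its Jacobian.
        y = np.mean([g_i(i, x) for i in range(n)], axis=0)
        Z = np.mean([J_i(i, x) for i in range(n)], axis=0)
        x_prev = x.copy()
        x = x - lr * (Z.T @ grad_f(y))  # composite gradient step
        for _ in range(epoch_len):
            idx = rng.integers(0, n, size=batch)
            # Recursive (SARAH-type) updates: correct the running
            # estimates by the sampled change between iterates.
            y = y + np.mean([g_i(i, x) - g_i(i, x_prev) for i in idx],
                            axis=0)
            Z = Z + np.mean([J_i(i, x) - J_i(i, x_prev) for i in idx],
                            axis=0)
            # Chain rule with the estimated Jacobian and inner value;
            # the outer composition makes this estimator biased.
            v = Z.T @ grad_f(y)
            x_prev = x.copy()
            x = x - lr * v
    return x
```

For an expectation rather than a finite sum, the anchor pass would use a large minibatch instead of a full pass over i = 1..n; the step size and batch sizes above are placeholders, not tuned values from the paper.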


Related research

- 08/29/2019, Multi-Level Composite Stochastic Optimization via Nested Variance Reduction: "We consider multi-level composite optimization problems where each mappi..."
- 06/20/2018, Stochastic Nested Variance Reduction for Nonconvex Optimization: "We study finite-sum nonconvex optimization problems, where the objective..."
- 11/13/2017, Variance Reduced methods for Non-convex Composition Optimization: "This paper explores the non-convex composition optimization in the form ..."
- 12/19/2022, Stochastic Inexact Augmented Lagrangian Method for Nonconvex Expectation Constrained Optimization: "Many real-world problems not only have complicated nonconvex functional ..."
- 02/07/2018, Improved Incremental First-Order Oracle Complexity of Variance Reduced Methods for Nonsmooth Convex Stochastic Composition Optimization: "We consider the nonsmooth convex composition optimization problem where ..."
- 05/22/2023, SignSVRG: fixing SignSGD via variance reduction: "We consider the problem of unconstrained minimization of finite sums of ..."
- 03/02/2021, ZeroSARAH: Efficient Nonconvex Finite-Sum Optimization with Zero Full Gradient Computation: "We propose ZeroSARAH – a novel variant of the variance-reduced method SA..."
