Convergence of constant step stochastic gradient descent for non-smooth non-convex functions

05/18/2020
by   Pascal Bianchi, et al.

This paper studies the asymptotic behavior of constant step Stochastic Gradient Descent for the minimization of an unknown function F, defined as the expectation of a non-convex, non-smooth, locally Lipschitz random function. As the gradient may not exist, it is replaced by a certain operator: a reasonable choice is an element of the Clarke subdifferential of the random function; another choice is the output of the celebrated backpropagation algorithm, which is popular amongst practitioners, and whose properties have recently been studied by Bolte and Pauwels [7]. Since the expectation of the chosen operator is not in general an element of the Clarke subdifferential ∂F of the mean function, it has been assumed in the literature that an oracle for ∂F is available. As a first result, it is shown in this paper that such an oracle is not needed for almost all initialization points of the algorithm. Next, in the small step size regime, it is shown that the interpolated trajectory of the algorithm converges in probability (in the sense of compact convergence, i.e., uniform convergence on compact time intervals) towards the set of solutions of the differential inclusion associated with ∂F. Finally, viewing the iterates as a Markov chain whose transition kernel is indexed by the step size, it is shown that the invariant distributions of the kernel converge weakly to the set of invariant distributions of this differential inclusion as the step size tends to zero. These results show that, when the step size is small, the iterates eventually lie, with large probability, in a neighborhood of the critical points of the mean function F.
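The algorithm under study is plain constant step-size SGD, x_{k+1} = x_k − γ g(x_k, ξ_{k+1}), where g is either a Clarke subgradient of the random function or the output of backpropagation. The snippet below is a minimal illustrative sketch (not the authors' code): it uses PyTorch autograd as the "backpropagation" operator on a toy non-smooth, non-convex, locally Lipschitz random loss (a small ReLU network with squared error); the network sizes, step size, and iteration count are arbitrary choices made for the example.

```python
# Minimal sketch (not from the paper): constant step-size SGD where the
# chosen operator is the backpropagation output (PyTorch autograd) on a
# non-smooth, non-convex, locally Lipschitz random loss.
import torch

torch.manual_seed(0)

# Parameters of a tiny one-hidden-layer ReLU network; F(w) = E_xi[ f(w, xi) ].
W1 = torch.randn(8, 4, requires_grad=True)
W2 = torch.randn(1, 8, requires_grad=True)
params = [W1, W2]

def sample_loss():
    """One realization of the random function w -> f(w, xi)."""
    x = torch.randn(4)            # random input  (part of xi)
    y = torch.randn(1)            # random target (part of xi)
    pred = W2 @ torch.relu(W1 @ x)
    return (pred - y).pow(2).sum()

gamma = 1e-2                      # constant step size
for _ in range(10_000):
    loss = sample_loss()
    grads = torch.autograd.grad(loss, params)   # backprop output g(w_k, xi_{k+1})
    with torch.no_grad():
        for p, g in zip(params, grads):
            p -= gamma * g        # w_{k+1} = w_k - gamma * g(w_k, xi_{k+1})
```

At points where the loss is differentiable, autograd returns the gradient (hence an element of the Clarke subdifferential of the random function); at kinks of the ReLU it returns one conventional value, which is exactly the situation the paper's analysis is designed to cover.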


Related research

06/19/2020
On the Almost Sure Convergence of Stochastic Gradient Descent in Non-Convex Problems
This paper analyzes the trajectories of stochastic gradient descent (SGD...

07/20/2017
Bridging the Gap between Constant Step Size Stochastic Gradient Descent and Markov Chains
We consider the minimization of an objective function given access to un...

04/03/2018
A Constant Step Stochastic Douglas-Rachford Algorithm with Application to Non Separable Regularizations
The Douglas Rachford algorithm is an algorithm that converges to a minim...

05/24/2022
Weak Convergence of Approximate Reflection Coupling and its Application to Non-convex Optimization
In this paper, we propose a weak approximation of the reflection couplin...

08/05/2019
Extending the step-size restriction for gradient descent to avoid strict saddle points
We provide larger step-size restrictions for which gradient descent base...

09/15/2023
Convergence of ADAM with Constant Step Size in Non-Convex Settings: A Simple Proof
In neural network training, RMSProp and ADAM remain widely favoured opti...

06/09/2017
Global Convergence of the (1+1) Evolution Strategy
We establish global convergence of the (1+1)-ES algorithm, i.e., converg...
