Dissecting the Effects of SGD Noise in Distinct Regimes of Deep Learning

01/31/2023
by Antonio Sclocchi, et al.

Understanding when the noise in stochastic gradient descent (SGD) affects the generalization of deep neural networks remains a challenge, complicated by the fact that networks can operate in distinct training regimes. Here we study how the magnitude of this noise T affects performance as the size of the training set P and the scale of initialization α are varied. For gradient descent, α is a key parameter that controls whether the network is 'lazy' (α ≫ 1) or instead learns features (α ≪ 1). For the classification of MNIST and CIFAR10 images, our central results are: (i) we obtain phase diagrams for performance in the (α, T) plane, which show that SGD noise can be detrimental or beneficial depending on the training regime. Moreover, although increasing T or decreasing α both allow the network to escape the lazy regime, these changes can have opposite effects on performance. (ii) Most importantly, we find that key dynamical quantities (including the total variation of the weights during training) depend on both T and P as power laws, and that the characteristic temperature T_c at which SGD noise starts to affect performance is itself a power law of P. These observations indicate that a key effect of SGD noise occurs late in training, by affecting the stopping process whereby all the data are fitted. We argue that, because of SGD noise, networks must develop a stronger 'signal', i.e. larger informative weights, to fit the data, leading to a longer training time. The same effect occurs at larger training set size P. We confirm this view in the perceptron model, where signal and noise can be measured precisely. Interestingly, the exponents characterizing the effect of SGD depend on the density of data near the decision boundary, as we explain.
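To make the setup concrete, the minimal sketch below trains an α-scaled perceptron with mini-batch SGD on toy data. The specific choices are illustrative assumptions, not the paper's exact protocol: a hinge loss, Gaussian inputs labelled by a fixed teacher direction, the identification of the SGD temperature with T = lr · α² / batch_size, and a stop once every training point is fitted. It only illustrates how α controls how far the weights must travel from initialization to fit the data (lazy for α ≫ 1, feature learning for α ≪ 1), while T sets the strength of the SGD noise.

```python
# Minimal sketch (assumed toy setup, not the paper's protocol): alpha-scaled
# perceptron, hinge loss, teacher-labelled Gaussian data, T = lr * alpha^2 / B.
import numpy as np

rng = np.random.default_rng(0)

def make_data(P, d=50):
    """Linearly separable toy data: labels given by a fixed teacher direction."""
    X = rng.standard_normal((P, d))
    teacher = np.ones(d) / np.sqrt(d)
    y = np.sign(X @ teacher)
    return X, y

def train(alpha, T, P=1000, d=50, batch_size=10, max_steps=20_000):
    """Mini-batch SGD on f(x) = alpha * (w - w0) . x / sqrt(d); stops early
    if every training point reaches hinge margin 1."""
    X, y = make_data(P, d)
    w0 = rng.standard_normal(d)             # initialization
    w = w0.copy()
    lr = T * batch_size / alpha**2          # so that T = lr * alpha^2 / batch_size
    for step in range(max_steps):
        idx = rng.integers(0, P, size=batch_size)
        xb, yb = X[idx], y[idx]
        margin = yb * alpha * (xb @ (w - w0)) / np.sqrt(d)
        active = margin < 1.0               # hinge loss: only unfitted examples contribute
        grad = -alpha * (yb[active][:, None] * xb[active]).sum(0) / (np.sqrt(d) * batch_size)
        w -= lr * grad
        if step % 100 == 0:                 # check the stopping condition on the full set
            if np.all(y * alpha * (X @ (w - w0)) / np.sqrt(d) >= 1.0):
                break
    return np.linalg.norm(w - w0)           # total variation of the weights

# Lazy (alpha >> 1) vs feature-learning (alpha << 1) regime at fixed SGD noise T:
for alpha in (100.0, 0.01):
    print(f"alpha={alpha:g}  ||w - w0|| = {train(alpha, T=0.05):.3g}")
```

With the output scaled by α, a small α forces much larger weight changes (a stronger 'signal') before every example is fitted, which is the mechanism the abstract invokes for longer training times under SGD noise and larger P.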

Related research

09/19/2023 · On the different regimes of Stochastic Gradient Descent
Modern deep networks are trained with stochastic gradient descent (SGD) ...

12/20/2021 · The effective noise of Stochastic Gradient Descent
Stochastic Gradient Descent (SGD) is the workhorse algorithm of deep lea...

07/22/2020 · Compressing invariant manifolds in neural nets
We study how neural networks compress uninformative input space in model...

05/05/2021 · Understanding Long Range Memory Effects in Deep Neural Networks
Stochastic gradient descent (SGD) is of fundamental importance in deep l...

05/26/2020 · Inherent Noise in Gradient Based Methods
Previous work has examined the ability of larger capacity neural network...

06/09/2022 · Trajectory-dependent Generalization Bounds for Deep Neural Networks via Fractional Brownian Motion
Despite being tremendously overparameterized, it is appreciated that dee...

10/04/2022 · How deep convolutional neural networks lose spatial information with training
A central question of machine learning is how deep nets manage to learn ...
