An Investigation into Neural Net Optimization via Hessian Eigenvalue Density

01/29/2019
by Behrooz Ghorbani, et al.

To understand the dynamics of optimization in deep neural networks, we develop a tool to study the evolution of the entire Hessian spectrum throughout the optimization process. Using this, we study a number of hypotheses concerning smoothness, curvature, and sharpness in the deep learning literature. We then thoroughly analyze a crucial structural feature of the spectra: in non-batch normalized networks, we observe the rapid appearance of large isolated eigenvalues in the spectrum, along with a surprising concentration of the gradient in the corresponding eigenspaces. In batch normalized networks, these two effects are almost absent. We characterize these effects, and explain how they affect optimization speed through both theory and experiments. As part of this work, we adapt advanced tools from numerical linear algebra that allow scalable and accurate estimation of the entire Hessian spectrum of ImageNet-scale neural networks; this technique may be of independent interest in other applications.
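The scalable spectrum estimation the abstract mentions rests on a classical idea from numerical linear algebra: run a short Lanczos iteration against a matrix you can only touch through matrix-vector products, then turn the resulting tridiagonal matrix into a smoothed eigenvalue density. The sketch below illustrates that general recipe in NumPy on an explicit toy matrix standing in for a Hessian; all function names and parameters here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def lanczos(matvec, dim, m, rng):
    # Run up to m steps of the Lanczos iteration with full reorthogonalization.
    # Returns the diagonal (alphas) and off-diagonal (betas) of the tridiagonal T.
    v = rng.standard_normal(dim)
    v /= np.linalg.norm(v)
    V = np.zeros((m, dim))
    alphas, betas = np.zeros(m), np.zeros(max(m - 1, 0))
    for j in range(m):
        V[j] = v
        w = matvec(v)
        alphas[j] = v @ w
        w = w - alphas[j] * v
        if j > 0:
            w = w - betas[j - 1] * V[j - 1]
        # Full reorthogonalization against all previous Lanczos vectors,
        # which keeps the recurrence numerically stable.
        w = w - V[: j + 1].T @ (V[: j + 1] @ w)
        if j < m - 1:
            beta = np.linalg.norm(w)
            if beta < 1e-8:  # Krylov subspace exhausted: stop early
                return alphas[: j + 1], betas[:j]
            betas[j] = beta
            v = w / beta
    return alphas, betas

def spectral_density(matvec, dim, grid, m=30, n_probe=4, sigma=0.05, seed=0):
    # Stochastic Lanczos quadrature: for each random probe vector, take the
    # eigenvalues of the small tridiagonal T as quadrature nodes and the
    # squared first components of its eigenvectors as weights, then average
    # Gaussian-smoothed point masses over probes.
    rng = np.random.default_rng(seed)
    density = np.zeros_like(grid)
    for _ in range(n_probe):
        a, b = lanczos(matvec, dim, m, rng)
        T = np.diag(a) + np.diag(b, 1) + np.diag(b, -1)
        evals, evecs = np.linalg.eigh(T)
        weights = evecs[0] ** 2  # quadrature weights sum to 1 per probe
        for lam, wt in zip(evals, weights):
            density += wt * np.exp(-((grid - lam) ** 2) / (2 * sigma**2))
    return density / (n_probe * np.sqrt(2 * np.pi) * sigma)

# Toy "Hessian": a bulk at eigenvalues -1 and +1 plus one large outlier at 8,
# mimicking the isolated large eigenvalues the paper observes.
H = np.diag(np.concatenate([np.full(50, -1.0), np.full(50, 1.0), [8.0]]))
grid = np.linspace(-2.0, 9.0, 500)
density = spectral_density(lambda v: H @ v, 101, grid)
```

For a real network, `lambda v: H @ v` would be replaced by a Hessian-vector product computed with two backward passes (Pearlmutter's trick), so each Lanczos step costs only a small constant number of gradient evaluations and the full Hessian is never materialized.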


Related research

11/16/2018 · The Full Spectrum of Deep Net Hessians At Scale: Dynamics with Sample Size
Previous works observed the spectrum of the Hessian of the training loss...

01/31/2022 · On the Power-Law Spectrum in Deep Learning: A Bridge to Protein Science
It is well-known that the Hessian matters to optimization, generalizatio...

12/16/2019 · PyHessian: Neural Networks Through the Lens of the Hessian
We present PyHessian, a new scalable framework that enables fast computa...

02/22/2018 · Hessian-based Analysis of Large Batch Training and Robustness to Adversaries
Large batch size training of Neural Networks has been shown to incur acc...

10/01/2019 · The asymptotic spectrum of the Hessian of DNN throughout training
The dynamics of DNNs during gradient descent is described by the so-call...

12/02/2019 · On the Delta Method for Uncertainty Approximation in Deep Learning
The Delta method is a well known procedure used to quantify uncertainty ...

08/04/2020 · Analytic Characterization of the Hessian in Shallow ReLU Models: A Tale of Symmetry
We consider the optimization problem associated with fitting two-layers ...
