Normalized Flat Minima: Exploring Scale Invariant Definition of Flat Minima for Neural Networks using PAC-Bayesian Analysis

01/15/2019
by   Yusuke Tsuzuku, et al.

The notion of flat minima has played a key role in explaining the generalization properties of deep learning models. However, existing definitions of flatness are known to be sensitive to rescalings of the parameters. This issue suggests that previous definitions of flatness do not necessarily capture generalization, because generalization is invariant to such rescalings. In this paper, from the PAC-Bayesian perspective, we scrutinize the discussion concerning flat minima and introduce the notion of normalized flat minima, which is free from the known scale-dependence issues. Additionally, we highlight the insufficiency of existing matrix-norm-based generalization error bounds. Our modified notion of flatness does not suffer from this insufficiency either, suggesting that it better captures generalization.
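To make the scale-dependence issue concrete, the sketch below (not the paper's code; the two-layer ReLU network, the isotropic-perturbation sharpness proxy, and names such as sharpness_proxy are illustrative assumptions) shows that a layer-wise rescaling leaves the network function, and hence generalization, unchanged while a naive flatness measure changes substantially.

```python
# Minimal sketch: naive flatness measures are not invariant to layer-wise
# rescaling, even though the network function is. Assumes a toy two-layer
# ReLU regression network and an isotropic Gaussian perturbation proxy.
import numpy as np

rng = np.random.default_rng(0)

# Toy data and a two-layer ReLU network f(x) = W2 @ relu(W1 @ x).
X = rng.normal(size=(64, 5))
y = rng.normal(size=(64, 1))
W1 = rng.normal(size=(8, 5))
W2 = rng.normal(size=(1, 8))

def loss(W1, W2):
    h = np.maximum(W1 @ X.T, 0.0)           # ReLU hidden activations
    pred = (W2 @ h).T                        # predictions, shape (64, 1)
    return float(np.mean((pred - y) ** 2))   # mean squared error

def sharpness_proxy(W1, W2, sigma=1e-2, n_samples=200):
    """Average loss increase under isotropic Gaussian parameter noise,
    a common but scale-dependent flatness proxy (an assumed stand-in,
    not the paper's definition)."""
    base = loss(W1, W2)
    increases = []
    for _ in range(n_samples):
        dW1 = sigma * rng.normal(size=W1.shape)
        dW2 = sigma * rng.normal(size=W2.shape)
        increases.append(loss(W1 + dW1, W2 + dW2) - base)
    return float(np.mean(increases))

alpha = 10.0
# Since relu(alpha * z) = alpha * relu(z) for alpha > 0, the rescaled pair
# (alpha * W1, W2 / alpha) computes exactly the same function.
print("loss (original, rescaled):", loss(W1, W2), loss(alpha * W1, W2 / alpha))
print("naive sharpness (original):", sharpness_proxy(W1, W2))
print("naive sharpness (rescaled):", sharpness_proxy(alpha * W1, W2 / alpha))
```

Running this, the loss is identical for the original and rescaled parameters, but the perturbation-based sharpness value differs by orders of magnitude; a scale-invariant, normalized notion of flatness is meant to avoid exactly this discrepancy.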
