Weighted Optimization: better generalization by smoother interpolation

06/15/2020
by Yuege Xie, et al.

We provide a rigorous analysis of how implicit bias towards smooth interpolants leads to low generalization error in the overparameterized setting. We give the first case study of this connection through a random Fourier series model and weighted least squares. We then argue, through this model and numerical experiments, that normalization methods in deep learning, such as weight normalization, improve generalization in overparameterized neural networks by implicitly encouraging smooth interpolants.
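To make the idea concrete, below is a minimal sketch (not the paper's exact construction) of weighted least squares over an overparameterized random Fourier series model: it compares the plain minimum-norm interpolant with a weighted one whose weights decay in frequency, so high-frequency coefficients are discouraged and the fitted interpolant is smoother. The feature map, weight profile, and all parameter choices here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

n, p = 20, 200                      # n samples, p >> n Fourier features
x_train = np.sort(rng.uniform(0, 1, n))
y_train = np.sin(2 * np.pi * x_train) + 0.1 * rng.normal(size=n)

freqs = np.arange(1, p + 1)         # feature frequencies (assumed)

def fourier_features(x):
    """Sine features phi_k(x) = sin(2*pi*k*x), k = 1..p."""
    return np.sin(2 * np.pi * np.outer(x, freqs))

Phi = fourier_features(x_train)     # n x p design matrix

# 1) Unweighted minimum-l2-norm interpolant: theta = Phi^+ y.
theta_plain = np.linalg.pinv(Phi) @ y_train

# 2) Weighted interpolant: minimize ||W^{-1} theta||_2 subject to
#    Phi theta = y, with weights w_k decaying in frequency so that
#    high-frequency coefficients are penalized (assumed decay profile).
w = 1.0 / freqs
theta_weighted = np.diag(w) @ np.linalg.pinv(Phi @ np.diag(w)) @ y_train

# Evaluate both interpolants on a fine grid to compare their behavior
# away from the training points.
x_test = np.linspace(0, 1, 500)
Phi_test = fourier_features(x_test)
f_plain = Phi_test @ theta_plain
f_weighted = Phi_test @ theta_weighted

print("max |f| on grid, plain   :", np.abs(f_plain).max())
print("max |f| on grid, weighted:", np.abs(f_weighted).max())
```

Both solutions interpolate the training data exactly; the weighted one typically oscillates far less between samples, which is the sense in which a frequency-decaying weighting acts as an implicit bias towards smooth interpolants.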
