Stable Rank Normalization for Improved Generalization in Neural Networks and GANs

06/11/2019
by Amartya Sanyal, et al.

Exciting new work on generalization bounds for neural networks (NNs), by Neyshabur et al. and Bartlett et al., depends on two parameter-dependent quantities: the upper bound on the Lipschitz constant and the stable rank (a softer version of the rank operator). This raises an interesting question: can controlling these quantities improve the generalization behaviour of NNs? To this end, we propose stable rank normalization (SRN), a novel, optimal, and computationally efficient weight-normalization scheme that minimizes the stable rank of a linear operator. Surprisingly, we find that SRN, despite being a non-convex optimization problem, can be shown to have a unique optimal solution. Moreover, we show that SRN allows control of the data-dependent empirical Lipschitz constant, which, in contrast to the Lipschitz upper bound, reflects the true behaviour of a model on a given dataset. We provide thorough analyses showing that SRN, when applied to the linear layers of a NN for classification, provides striking improvements (11.3% relative to a standard NN) along with a significant reduction in memorization. When applied to the discriminator of GANs (SRN-GAN), it improves the Inception, FID, and Neural divergence scores on the CIFAR-10/100 and CelebA datasets, while learning mappings with low empirical Lipschitz constants.
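As a rough illustration of the quantities involved (not the paper's exact algorithm, whose details are in the full text), the stable rank srank(W) = ||W||_F^2 / ||W||_2^2 and a simple projection onto a target stable rank can be sketched in NumPy. The `srn_project` helper and its residual-scaling rule are an assumption for illustration: it keeps the top singular direction (so the spectral norm is unchanged) and rescales the remaining directions until the target stable rank is met.

```python
import numpy as np

def stable_rank(W):
    """srank(W) = ||W||_F^2 / ||W||_2^2; always <= rank(W) and
    robust to small singular values."""
    s = np.linalg.svd(W, compute_uv=False)
    return np.sum(s ** 2) / s[0] ** 2

def srn_project(W, target):
    """Illustrative stable-rank projection (assumed scheme, not the
    paper's verbatim algorithm): keep the rank-1 leading part so the
    spectral norm is preserved, and rescale the residual so that
    srank of the result equals `target`. Requires
    1 <= target <= srank(W)."""
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    top = s[0] * np.outer(U[:, 0], Vt[0])   # rank-1 leading part
    residual = W - top                       # remaining directions
    res_norm = np.linalg.norm(residual)      # Frobenius norm of residual
    if res_norm == 0.0:
        return W                             # already rank 1
    # Solve (s0^2 + gamma^2 * res_norm^2) / s0^2 = target for gamma.
    gamma = s[0] * np.sqrt(target - 1.0) / res_norm
    return top + gamma * residual
```

For example, `W = np.diag([3.0, 2.0, 1.0])` has srank 14/9 ≈ 1.56; projecting it to a target of 1.5 shrinks the trailing singular values while leaving the spectral norm at 3, which is the sense in which such a scheme trades rank for an unchanged Lipschitz upper bound.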
