Ghost Noise for Regularizing Deep Neural Networks

05/26/2023
by Atli Kosson, et al.

Batch Normalization (BN) is widely used to stabilize the optimization process and improve the test performance of deep neural networks. The regularization effect of BN depends on the batch size, and explicitly using smaller batch sizes for normalization, a method known as Ghost Batch Normalization (GBN), has been found to improve generalization in many settings. We investigate the effectiveness of GBN by disentangling the induced "Ghost Noise" from normalization and quantitatively analyzing the distribution of this noise as well as its impact on model performance. Inspired by our analysis, we propose a new regularization technique called Ghost Noise Injection (GNI) that imitates the noise in GBN without incurring the detrimental train-test discrepancy effects of small-batch training. We experimentally show that GNI can provide a greater generalization benefit than GBN. Ghost Noise Injection can also be beneficial in otherwise non-noisy settings such as layer-normalized networks, providing additional evidence for the usefulness of Ghost Noise in Batch Normalization as a regularizer.
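To make the GBN idea concrete, the following is a minimal numpy sketch of Ghost Batch Normalization during training: the full batch is split into smaller "ghost" batches, and each ghost batch is normalized with its own mean and variance rather than the full-batch statistics. The function name and interface are illustrative assumptions, not the authors' code, and the learnable scale/shift parameters of a full BN layer are omitted for brevity.

```python
import numpy as np

def ghost_batch_norm(x, ghost_size, eps=1e-5):
    """Illustrative sketch of Ghost Batch Normalization (training mode).

    x          : array of shape (batch, features)
    ghost_size : number of samples per ghost batch; the smaller this is,
                 the noisier the per-ghost-batch statistics ("Ghost Noise")
    """
    n = x.shape[0]
    assert n % ghost_size == 0, "batch must split into equal ghost batches"
    out = np.empty_like(x, dtype=float)
    for start in range(0, n, ghost_size):
        g = x[start:start + ghost_size]
        # Each ghost batch is normalized with its OWN statistics,
        # not the statistics of the full batch.
        mean = g.mean(axis=0, keepdims=True)
        var = g.var(axis=0, keepdims=True)
        out[start:start + ghost_size] = (g - mean) / np.sqrt(var + eps)
    return out
```

Because each ghost batch sees a different random subset of the data, its mean and variance fluctuate around the full-batch statistics; this fluctuation is the noise the paper isolates and then imitates directly with Ghost Noise Injection.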


