Data Dropout in Arbitrary Basis for Deep Network Regularization

12/04/2017
by Mostafa Rahmani, et al.

An important problem in training high-capacity deep networks is ensuring that the trained network generalizes well to new inputs outside the training dataset. Dropout is an effective regularization technique that boosts generalization by setting a random subset of the data elements and extracted features to zero during training. In this paper, we propose a new randomized regularization technique that withholds a random part of the data without necessarily turning off individual neurons or data elements. In the proposed method, of which conventional dropout is shown to be a special case, random data dropout is performed in an arbitrary basis, hence the designation Generalized Dropout. We also present a framework whereby the proposed technique can be applied efficiently to convolutional neural networks. The presented numerical experiments demonstrate that the proposed technique yields notable performance gains. Generalized Dropout provides new insight into the idea of dropout, shows that different basis matrices yield different performance gains, and opens up a new research question as to how to choose basis matrices that maximize the performance gain.
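To make the idea concrete, the following is a minimal sketch, not the authors' reference implementation, of dropout applied in an arbitrary orthonormal basis, written in PyTorch. The function name, signature, and inverted-dropout rescaling are assumptions; the abstract only specifies that a random part of the data is withheld in a chosen basis, with conventional dropout recovered when that basis is the identity.

```python
import torch

def generalized_dropout(x, basis, p=0.5, training=True):
    """Drop a random subset of the coefficients of x expressed in `basis`.

    x:     (batch, d) tensor of data or extracted features
    basis: (d, d) orthonormal matrix Q; Q = I recovers conventional dropout
    p:     probability of dropping each coefficient
    """
    if not training or p == 0.0:
        return x
    coeffs = x @ basis                                # coordinates in the chosen basis
    keep = (torch.rand_like(coeffs) > p).to(x.dtype)  # random subset of coefficients to keep
    coeffs = coeffs * keep / (1.0 - p)                # inverted-dropout rescaling (an assumption here)
    return coeffs @ basis.t()                         # map back to the original coordinates

# Hypothetical usage: a random orthonormal basis via QR decomposition
Q = torch.linalg.qr(torch.randn(64, 64)).Q
x = torch.randn(32, 64)
y = generalized_dropout(x, Q, p=0.3)
```

With `basis` set to the identity matrix, the projection is a no-op and the function reduces to standard (inverted) dropout on the raw elements, which mirrors the special-case relationship stated in the abstract.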

Related research

06/22/2022  Information Geometry of Dropout Training
Dropout is one of the most popular regularization techniques in neural n...

11/21/2016  Generalized Dropout
Deep Neural Networks often require good regularizers to generalize well....

11/23/2017  Regularization of Deep Neural Networks with Spectral Dropout
The big breakthrough on the ImageNet challenge in 2012 was partially due...

05/09/2017  Learning Deep Networks from Noisy Labels with Dropout Regularization
Large datasets often have unreliable labels-such as those obtained from ...

01/22/2018  The Hybrid Bootstrap: A Drop-in Replacement for Dropout
Regularization is an important component of predictive model building. T...

01/05/2021  AutoDropout: Learning Dropout Patterns to Regularize Deep Networks
Neural networks are often over-parameterized and hence benefit from aggr...

11/04/2016  Information Dropout: Learning Optimal Representations Through Noisy Computation
The cross-entropy loss commonly used in deep learning is closely related...
