Adding noise to the input of a model trained with a regularized objective

04/16/2011
by Salah Rifai et al.

Regularization is a well-studied problem in the context of neural networks. It is usually used to improve generalization performance when the number of input samples is relatively small or heavily contaminated with noise. A parametric model can be regularized in several ways, for instance through early stopping (Morgan and Bourlard, 1990), weight decay, or output smoothing, all of which are used to avoid overfitting during training. From a Bayesian point of view, many regularization techniques correspond to imposing certain prior distributions on the model parameters (Krogh and Hertz, 1991). Using Bishop's approximation (Bishop, 1995) of the objective function when a restricted type of noise is added to the input of a parametric function, we derive the higher-order terms of the Taylor expansion and analyze the coefficients of the regularization terms induced by the noisy input. In particular, we study, in terms of generalization performance, the effect of penalizing the Hessian of the mapping function with respect to the input. We also show how this coefficient can be controlled independently by explicitly penalizing the Jacobian of the mapping function on corrupted inputs.
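To make the starting point concrete, the following is the standard second-order expansion on which Bishop's approximation rests, sketched for a scalar mapping f under additive isotropic Gaussian input noise; the notation and the scalar-output simplification are ours, not the paper's:

```latex
% Squared-error objective under additive input noise
% \epsilon \sim \mathcal{N}(0, \sigma^2 I), expanded to second order
% (a sketch; scalar output assumed, odd moments of \epsilon vanish).
\mathbb{E}_{\epsilon}\!\left[\bigl(f(x+\epsilon)-y\bigr)^{2}\right]
  = \bigl(f(x)-y\bigr)^{2}
  + \sigma^{2}\,\bigl\|\nabla_{x} f(x)\bigr\|^{2}
  + \sigma^{2}\,\bigl(f(x)-y\bigr)\,
    \operatorname{Tr}\!\bigl(\nabla_{x}^{2} f(x)\bigr)
  + O(\sigma^{4})
```

The squared-gradient term is the familiar Jacobian (Tikhonov) penalty, while the residual-weighted trace-of-Hessian term and the fourth-order contributions are the higher-order coefficients the abstract refers to. For the second idea, penalizing the Jacobian on corrupted inputs, a minimal sketch could look as follows; PyTorch, the function name, and the noise scale are illustrative assumptions, not the authors' code:

```python
import torch

def jacobian_penalty_on_corrupted(f, x, sigma=0.1):
    """Squared Frobenius norm of the Jacobian of f at noise-corrupted
    inputs (an illustrative sketch, not the paper's implementation)."""
    # Corrupt the input, then track gradients with respect to the
    # corrupted version, as the abstract describes.
    x_noisy = (x + sigma * torch.randn_like(x)).detach().requires_grad_(True)
    out = f(x_noisy)  # expected shape: (batch, n_outputs)
    penalty = x.new_zeros(())
    for k in range(out.shape[1]):
        # Per-sample gradient of output unit k w.r.t. the noisy input;
        # create_graph=True keeps the penalty itself differentiable.
        (grads,) = torch.autograd.grad(out[:, k].sum(), x_noisy,
                                       create_graph=True)
        penalty = penalty + (grads ** 2).sum()
    return penalty / x.shape[0]

# Example usage (hypothetical model and weight):
#   loss = task_loss + 0.01 * jacobian_penalty_on_corrupted(model, inputs)
```

In practice the returned penalty would be added to the task loss with its own weight, which is what allows the corresponding regularization coefficient to be tuned independently of the input noise level.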
