Improved Mixed-Example Data Augmentation

05/29/2018
by Cecilia Summers, et al.

In order to reduce overfitting, neural networks are typically trained with data augmentation, the practice of artificially generating additional training data via label-preserving transformations of existing training examples. Recent work has demonstrated a surprisingly effective type of non-label-preserving data augmentation, in which pairs of training examples are averaged together. In this work, we generalize this "mixed-example data augmentation", which allows us to find methods that improve upon previous work. This generalization also reveals that linearity is not necessary as an inductive bias in order for mixed-example data augmentation to be effective, providing evidence against the primary theoretical hypothesis from prior work.
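As a rough sketch of the pairwise averaging described above, the snippet below forms a convex combination of two training examples and their labels, in the spirit of mixup. The function name mix_examples, the Beta(alpha, alpha) mixing distribution, and the default alpha are illustrative assumptions, not the specific generalized methods proposed in the paper.

```python
# Minimal mixup-style mixed-example augmentation (illustrative sketch, not the paper's method).
import numpy as np

def mix_examples(x1, y1, x2, y2, alpha=0.2, rng=None):
    """Return a convex combination of two training examples and their labels.

    x1, x2 : input arrays of the same shape (e.g. images)
    y1, y2 : one-hot label vectors of the same shape
    alpha  : concentration of the assumed Beta(alpha, alpha) mixing distribution
    """
    rng = np.random.default_rng() if rng is None else rng
    lam = rng.beta(alpha, alpha)            # mixing weight in [0, 1]
    x_mixed = lam * x1 + (1.0 - lam) * x2   # average the two inputs
    y_mixed = lam * y1 + (1.0 - lam) * y2   # mix the labels with the same weight
    return x_mixed, y_mixed

# Example: mix two 32x32 RGB images with 10-class one-hot labels.
x_a, x_b = np.random.rand(32, 32, 3), np.random.rand(32, 32, 3)
y_a, y_b = np.eye(10)[3], np.eye(10)[7]
x_mix, y_mix = mix_examples(x_a, y_a, x_b, y_b)
```

The paper's generalization replaces this element-wise linear average with a broader family of ways to combine pairs of examples; the linear case above is only the baseline it builds on.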

Related research

06/11/2018 · Data augmentation instead of explicit regularization
Modern deep artificial neural networks have achieved impressive results ...

04/30/2020 · When does data augmentation help generalization in NLP?
Neural models often exploit superficial ("weak") features to achieve goo...

03/26/2023 · Analyzing Effects of Mixed Sample Data Augmentation on Model Interpretability
Data augmentation strategies are actively used when training deep neural...

07/06/2020 · On Data Augmentation and Adversarial Risk: An Empirical Analysis
Data augmentation techniques have become standard practice in deep learn...

06/07/2021 · MixRL: Data Mixing Augmentation for Regression using Reinforcement Learning
Data augmentation is becoming essential for improving regression accurac...

02/25/2020 · On Feature Normalization and Data Augmentation
Modern neural network training relies heavily on data augmentation for i...

11/12/2019 · Learning from Data-Rich Problems: A Case Study on Genetic Variant Calling
Next Generation Sequencing can sample the whole genome (WGS) or the 1-2 ...
