When Neural Networks Fail to Generalize? A Model Sensitivity Perspective

by Jiajin Zhang, et al.

Domain generalization (DG) aims to train a model that performs well in unseen domains with different distributions. This paper considers a more realistic yet more challenging scenario, namely Single Domain Generalization (Single-DG), where only a single source domain is available for training. To tackle this challenge, we first seek to understand when neural networks fail to generalize. We empirically identify a property of a model that correlates strongly with its generalization ability, which we coin "model sensitivity". Based on our analysis, we propose a novel strategy of Spectral Adversarial Data Augmentation (SADA) to generate augmented images targeted at the highly sensitive frequencies. Models trained with these hard-to-learn samples effectively suppress sensitivity in the frequency space, which leads to improved generalization performance. Extensive experiments on multiple public datasets demonstrate the superiority of our approach, which surpasses state-of-the-art single-DG methods.
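To make the idea of frequency-targeted augmentation concrete, here is a minimal sketch of perturbing an image in the spectral domain. The abstract does not give SADA's actual procedure, so this is an illustration only: the `mask` argument, the random-noise perturbation (a stand-in for an adversarial step), and all parameter names here are assumptions, not the paper's implementation.

```python
import numpy as np

def spectral_perturb(image, mask, epsilon=0.05, rng=None):
    """Perturb selected frequencies of an image and return to pixel space.

    `mask` (same shape as `image`) selects which frequencies to perturb;
    in SADA these would be the model-sensitive frequencies, identified
    adversarially. Here the mask is simply supplied by the caller, and
    random noise stands in for an adversarial update.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    spectrum = np.fft.fft2(image)                 # image -> frequency space
    noise = rng.standard_normal(image.shape)      # stand-in perturbation
    # Scale the perturbation by the local spectral magnitude so that
    # energetic frequencies receive proportionally larger changes.
    spectrum += epsilon * mask * noise * np.abs(spectrum)
    perturbed = np.fft.ifft2(spectrum).real       # frequency -> image space
    return np.clip(perturbed, 0.0, 1.0)

# Usage: perturb only the high-frequency band of a toy grayscale image.
img = np.linspace(0.0, 1.0, 64 * 64).reshape(64, 64)
freq = np.fft.fftfreq(64)
high_band = (np.abs(freq)[:, None] > 0.25) | (np.abs(freq)[None, :] > 0.25)
aug = spectral_perturb(img, high_band.astype(float))
```

In a training loop, such augmented samples would be mixed with (or substituted for) the clean source images, so the model learns to be invariant to changes in the targeted frequency bands.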

