On the Effects of Batch and Weight Normalization in Generative Adversarial Networks

04/13/2017
by Sitao Xiang, et al.

Generative adversarial networks (GANs) are highly effective unsupervised learning frameworks that can generate very sharp data, even for data such as images with complex, highly multimodal distributions. However, GANs are known to be very hard to train, suffering from problems such as mode collapse and disturbing visual artifacts. Batch normalization (BN) techniques have been introduced to address these training difficulties. Though BN accelerates training in the beginning, our experiments show that the use of BN can be unstable and can negatively impact the quality of the trained model. The evaluation of BN and numerous other recent schemes for improving GAN training is hindered by the lack of an effective objective quality measure for GAN models. To address these issues, we first introduce a weight normalization (WN) approach for GAN training that significantly improves the stability, efficiency, and quality of the generated samples. To allow a methodical evaluation, we introduce the squared Euclidean reconstruction error on a test set as a new objective measure, assessing training performance in terms of speed, stability, and quality of the generated samples. Our experiments with a standard DCGAN architecture on commonly used datasets (CelebA, LSUN bedroom, and CIFAR-10) indicate that training using WN is generally superior to BN for GANs, achieving significantly better qualitative results than BN. We further demonstrate the stability of WN on a 21-layer ResNet trained with the CelebA dataset. The code for this paper is available at https://github.com/stormraiser/gan-weightnorm-resnet
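The two ingredients named in the abstract can be illustrated concretely. Below is a minimal NumPy sketch, not the paper's exact per-layer formulation: weight normalization reparameterizes a weight vector as w = g * v / ||v||, decoupling its direction from its scale, and the proposed quality measure is the mean squared Euclidean distance between held-out samples and their best reconstructions. Function names here are illustrative, not from the paper's code.

```python
import numpy as np

def weight_norm(v, g):
    """Weight normalization: reparameterize a weight vector as
    w = g * v / ||v||, so the norm of w is controlled by the scalar g
    while v only determines its direction."""
    return g * v / np.linalg.norm(v)

def squared_reconstruction_error(x, x_recon):
    """Mean squared Euclidean distance between test samples (rows of x)
    and their reconstructions, used as an objective quality measure."""
    return np.mean(np.sum((x - x_recon) ** 2, axis=1))

# The norm of the reparameterized weight equals g regardless of v's scale.
v = np.array([3.0, 4.0])          # ||v|| = 5
w = weight_norm(v, g=2.0)         # w = [1.2, 1.6], ||w|| = 2.0
```

Because gradients with respect to g and v are decoupled, the effective learning rate no longer depends on the scale of the raw weights, which is one intuition for the improved training stability reported in the paper.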


