Stabilizing Training of Generative Adversarial Nets via Langevin Stein Variational Gradient Descent

04/22/2020
by Dong Wang, et al.

Generative adversarial networks (GANs), well known for their ability to learn complex underlying data distributions, are nevertheless notoriously difficult to train, and training often ends in mode collapse or performance deterioration. Most current approaches to these issues rely on practical training heuristics for regularization, which in turn undermine GANs' convergence and theoretical soundness. In this paper, we propose to stabilize GAN training via a novel particle-based variational inference method, Langevin Stein variational gradient descent (LSVGD), which not only inherits the flexibility and efficiency of the original SVGD but also addresses its instability issues by injecting an extra disturbance into the update dynamics. We further demonstrate that, by properly adjusting the noise variance, LSVGD simulates a Langevin process whose stationary distribution is exactly the target distribution. We also show that the LSVGD dynamics carries an implicit regularization that enhances the spread-out and diversity of the particles. Finally, we present an efficient way of applying particle-based variational inference to a general GAN training procedure, regardless of the loss function adopted. Experimental results on one synthetic dataset and three popular benchmark datasets, CIFAR-10, Tiny-ImageNet, and CelebA, validate that LSVGD can remarkably improve the performance and stability of various GAN models.
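At a glance, the update described in the abstract can be read as the standard SVGD particle update plus an injected Gaussian disturbance tied to the step size, so that the dynamics resembles a discretized Langevin process. The sketch below is a minimal illustration of that idea under assumed choices: an RBF kernel with the median bandwidth heuristic, an isotropic noise term with a free scale `noise_std`, and a hypothetical `score_fn` returning the gradient of the target log-density. It is not the paper's exact algorithm or noise-variance schedule.

```python
import numpy as np

def rbf_kernel(X, h=None):
    """RBF kernel matrix and SVGD repulsive term for particles X of shape (n, d)."""
    diff = X[:, None, :] - X[None, :, :]        # diff[i, j] = x_i - x_j
    sq_dists = np.sum(diff ** 2, axis=-1)
    if h is None:                               # median-heuristic bandwidth
        h = np.median(sq_dists) / np.log(len(X) + 1) + 1e-8
    K = np.exp(-sq_dists / h)
    # Repulsive term: sum_j grad_{x_j} k(x_j, x_i), which pushes particles apart.
    repulse = (2.0 / h) * np.einsum('ijd,ij->id', diff, K)
    return K, repulse

def lsvgd_step(X, score_fn, step=1e-2, noise_std=0.1, rng=None):
    """One LSVGD-style update: SVGD direction plus a Gaussian disturbance."""
    rng = np.random.default_rng() if rng is None else rng
    K, repulse = rbf_kernel(X)
    # Standard SVGD direction: kernel-smoothed score plus the repulsive term.
    phi = (K @ score_fn(X) + repulse) / len(X)
    # Injected disturbance, scaled like a Langevin discretization step;
    # `noise_std` is a free knob standing in for the paper's variance schedule.
    noise = np.sqrt(2.0 * step) * noise_std * rng.normal(size=X.shape)
    return X + step * phi + noise

# Toy usage: move over-dispersed particles toward a standard 2-D Gaussian.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = 3.0 * rng.normal(size=(100, 2))         # over-dispersed initialization
    score = lambda Z: -Z                        # grad log N(0, I)
    for _ in range(500):
        X = lsvgd_step(X, score, rng=rng)
    print("mean ~ 0:", X.mean(axis=0), " var ~ 1:", X.var(axis=0))
```

In this toy setup the noise term keeps the particles from collapsing onto a few kernel-induced attractors, mirroring the diversity-enhancing role the abstract attributes to the disturbance.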


