Learning High-Dimensional Distributions with Latent Neural Fokker-Planck Kernels

05/10/2021
by Yufan Zhou, et al.

Learning high-dimensional distributions is an important yet challenging problem in machine learning, with applications in various domains. In this paper, we introduce new techniques that formulate the problem as solving the Fokker-Planck equation in a lower-dimensional latent space, aiming to mitigate the challenges of working directly in high-dimensional data space. Our proposed model consists of latent-distribution morphing, a generator, and a parameterized Fokker-Planck kernel function. One appealing property of our model is that it can be trained with an arbitrary number of latent-distribution morphing steps, or even without morphing, which makes it flexible and as efficient as Generative Adversarial Networks (GANs). This property also makes latent-distribution morphing an efficient plug-and-play scheme that can be used to improve arbitrary GANs and, more interestingly, can effectively correct failure cases of GAN models. Extensive experiments illustrate the advantages of our proposed method over existing models.
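To make the architecture concrete, below is a minimal sketch, not the authors' code, of how a generator plus latent-distribution morphing pipeline might look, assuming the morphing is realized as a few Langevin-style updates of the latent samples driven by a learned latent-space energy (a stand-in for the parameterized Fokker-Planck kernel). All module names, dimensions, and the energy-based drift are illustrative assumptions rather than details taken from the paper.

```python
# Hedged sketch: generator + optional latent-distribution morphing.
# Hypothetical names and sizes throughout; not the authors' implementation.
import torch
import torch.nn as nn

LATENT_DIM, DATA_DIM = 64, 784  # hypothetical sizes

generator = nn.Sequential(            # maps latent z to data space
    nn.Linear(LATENT_DIM, 256), nn.ReLU(),
    nn.Linear(256, DATA_DIM), nn.Tanh(),
)

latent_energy = nn.Sequential(         # stand-in for a latent-space drift term
    nn.Linear(LATENT_DIM, 128), nn.ReLU(),
    nn.Linear(128, 1),
)

def morph_latent(z, steps=5, step_size=1e-2, noise_scale=1e-2):
    """Move latent samples toward low-energy regions via Langevin-like updates.
    With steps=0 this reduces to plain GAN-style sampling from the prior."""
    z = z.detach()
    for _ in range(steps):
        z.requires_grad_(True)
        energy = latent_energy(z).sum()
        grad, = torch.autograd.grad(energy, z)
        z = (z - step_size * grad
             + noise_scale * torch.randn_like(z)).detach()
    return z

z0 = torch.randn(16, LATENT_DIM)       # samples from the latent prior
x = generator(morph_latent(z0))        # morph in latent space, then decode
```

Because the morphing loop operates purely on latent codes before the generator is applied, the same wrapper could in principle be attached to any pretrained GAN generator, which is the plug-and-play flavor the abstract describes.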

