Latent Traversals in Generative Models as Potential Flows

04/25/2023
by Yue Song, et al.

Despite the significant recent progress in deep generative models, the underlying structure of their latent spaces is still poorly understood, making the task of performing semantically meaningful latent traversals an open research challenge. Most prior work has aimed to solve this challenge by modeling latent structures linearly, finding corresponding linear directions that result in 'disentangled' generations. In this work, we instead propose to model latent structures with a learned dynamic potential landscape, performing latent traversals as the flow of samples down the landscape's gradient. Inspired by physics, optimal transport, and neuroscience, these potential landscapes are learned as physically realistic partial differential equations, allowing them to flexibly vary over both space and time. To achieve disentanglement, multiple potentials are learned simultaneously and are constrained by a classifier to be distinct and semantically self-consistent. Experimentally, we demonstrate that our method yields trajectories that are both qualitatively and quantitatively more disentangled than those of state-of-the-art baselines. Further, we demonstrate that our method can be integrated as a regularization term during training, acting as an inductive bias towards the learning of structured representations and ultimately improving model likelihood on similarly structured data.
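To make the core idea concrete, the following is a minimal, hypothetical sketch (not the authors' released code) of a latent traversal as gradient flow: a small network models a scalar potential Phi(z, t) over the latent space, and a latent code is moved down the potential's gradient with explicit Euler steps. The names `PotentialNet`, `traverse`, and the step size `eta` are illustrative assumptions; the paper additionally learns multiple such potentials as physically realistic PDEs and constrains them with a classifier, which this sketch omits.

```python
import torch
import torch.nn as nn

# Hypothetical sketch: a single learned potential Phi(z, t) over latent space,
# with time appended so the landscape can vary over both space and time.
class PotentialNet(nn.Module):
    def __init__(self, latent_dim: int, hidden: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim + 1, hidden),
            nn.Softplus(),
            nn.Linear(hidden, hidden),
            nn.Softplus(),
            nn.Linear(hidden, 1),
        )

    def forward(self, z: torch.Tensor, t: torch.Tensor) -> torch.Tensor:
        # Scalar potential value per sample.
        return self.net(torch.cat([z, t], dim=-1)).squeeze(-1)


def traverse(potential: PotentialNet, z0: torch.Tensor,
             steps: int = 10, eta: float = 0.1):
    """Explicit-Euler gradient flow of latent codes down the potential:
    z_{k+1} = z_k - eta * dPhi/dz."""
    z = z0.clone()
    trajectory = [z.clone()]
    for k in range(steps):
        z = z.detach().requires_grad_(True)
        t = torch.full((z.shape[0], 1), k / steps)
        phi = potential(z, t).sum()
        (grad,) = torch.autograd.grad(phi, z)
        z = (z - eta * grad).detach()
        trajectory.append(z.clone())
    return trajectory


if __name__ == "__main__":
    latent_dim = 16
    potential = PotentialNet(latent_dim)
    z0 = torch.randn(4, latent_dim)   # batch of latent codes
    path = traverse(potential, z0)
    # Each z along `path` would be decoded by a pretrained generator G(z)
    # to render one frame of the semantic traversal.
    print(len(path), path[-1].shape)
```

In a full pipeline, one potential network per semantic factor would be trained jointly with the classifier-based constraint described in the abstract, and the traversal above would be applied to the latent space of a frozen pretrained generator.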

