Metrics for Deep Generative Models

11/03/2017
by   Nutan Chen, et al.

Neural samplers such as variational autoencoders (VAEs) or generative adversarial networks (GANs) approximate distributions by transforming samples from a simple random source---the latent space---to samples from a more complex distribution represented by a dataset. While the manifold hypothesis implies that the density induced by a dataset contains large regions of low density, the training criteria of VAEs and GANs make the latent space densely covered. Consequently, points that are separated by low-density regions in observation space are pushed together in latent space, making stationary distances poor proxies for similarity. We transfer ideas from Riemannian geometry to this setting, letting the distance between two points be the shortest path on a Riemannian manifold induced by the transformation. The method yields a principled distance measure, provides a tool for the visual inspection of deep generative models, and offers an alternative to linear interpolation in latent space. In addition, it can be applied to robot movement generalization using previously learned skills. The method is evaluated on a synthetic dataset with known ground truth, on a simulated robot arm dataset, on human motion capture data, and on a generative model of handwritten digits.
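The core construction behind such a distance can be sketched numerically: the generator's Jacobian pulls the observation-space metric back into latent space, and curve lengths are measured under that pullback metric. The sketch below uses a toy hand-written decoder and a finite-difference Jacobian; the function names and the straight-line discretization are illustrative assumptions, not the paper's method (which computes shortest, i.e. geodesic, paths rather than straight lines).

```python
import numpy as np

# Hypothetical decoder g: R^2 (latent) -> R^3 (observation).
# A stand-in for a trained VAE/GAN generator, not the paper's model.
def decoder(z):
    return np.array([np.sin(z[0]), np.cos(z[1]), z[0] * z[1]])

def jacobian(g, z, eps=1e-5):
    """Finite-difference Jacobian J of g at z."""
    z = np.asarray(z, dtype=float)
    g0 = g(z)
    J = np.zeros((g0.size, z.size))
    for i in range(z.size):
        dz = np.zeros_like(z)
        dz[i] = eps
        J[:, i] = (g(z + dz) - g0) / eps
    return J

def riemannian_length(g, z_start, z_end, steps=100):
    """Length of the straight latent-space segment under the pullback
    metric G(z) = J(z)^T J(z) induced by the decoder g."""
    z_start = np.asarray(z_start, dtype=float)
    z_end = np.asarray(z_end, dtype=float)
    ts = np.linspace(0.0, 1.0, steps + 1)
    path = [(1 - t) * z_start + t * z_end for t in ts]
    length = 0.0
    for a, b in zip(path[:-1], path[1:]):
        J = jacobian(g, 0.5 * (a + b))  # metric at segment midpoint
        G = J.T @ J                      # pullback metric tensor
        dz = b - a
        length += np.sqrt(dz @ G @ dz)   # ds = sqrt(dz^T G dz)
    return length
```

Because the pullback length of a latent curve equals the length of its image in observation space, a straight latent line can be far longer under this metric than the Euclidean latent distance suggests, which is exactly why geodesics (shortest such curves) are the principled interpolants.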


