Neural Autoregressive Flows

04/03/2018
by Chin-Wei Huang, et al.

Normalizing flows and autoregressive models have been successfully combined to produce state-of-the-art results in density estimation, via Masked Autoregressive Flows (MAF), and to accelerate state-of-the-art WaveNet-based speech synthesis to 20x faster than real-time, via Inverse Autoregressive Flows (IAF). We unify and generalize these approaches, replacing the (conditionally) affine univariate transformations of MAF/IAF with a more general class of invertible univariate transformations expressed as monotonic neural networks. We demonstrate that the proposed neural autoregressive flows (NAF) are universal approximators for continuous probability distributions, and their greater expressivity allows them to better capture multimodal target distributions. Experimentally, NAF yields state-of-the-art performance on a suite of density estimation tasks and outperforms IAF in variational autoencoders trained on binarized MNIST.
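
At the core of NAF, each dimension x_t is passed through a strictly monotonic (hence invertible) scalar network whose parameters are produced by an autoregressive conditioner over x_{<t}. The sketch below is a rough PyTorch illustration, not the authors' implementation, of one such transformer in the style of the paper's deep sigmoidal flow: positive slopes via softplus and simplex weights via softmax guarantee monotonicity. The class name and hyperparameters are illustrative only, and the conditioner is omitted for brevity.

```python
# A minimal sketch (not the authors' code) of a deep-sigmoidal-flow-style
# monotonic transformer, assuming PyTorch. In the full NAF model the
# parameters (a, b, w) would be emitted per dimension by an autoregressive
# conditioner on x_{<t}; here they are free parameters for illustration.

import torch
import torch.nn.functional as F

class DeepSigmoidalFlow(torch.nn.Module):
    def __init__(self, n_hidden=8):
        super().__init__()
        # Unconstrained parameters; positivity and simplex constraints are
        # enforced in forward() via softplus and softmax.
        self.a = torch.nn.Parameter(torch.randn(n_hidden))
        self.b = torch.nn.Parameter(torch.randn(n_hidden))
        self.w = torch.nn.Parameter(torch.randn(n_hidden))

    def forward(self, x):
        # x: (batch,) scalar inputs; broadcast over the hidden units.
        a = F.softplus(self.a)          # slopes > 0 ensure monotonicity
        w = torch.softmax(self.w, -1)   # convex combination of sigmoids
        pre = torch.sigmoid(a * x.unsqueeze(-1) + self.b)   # (batch, n_hidden)
        s = (w * pre).sum(-1).clamp(1e-6, 1 - 1e-6)
        # Inverse sigmoid (logit) keeps the output unbounded, so such layers
        # can be stacked while remaining strictly monotonic in x.
        return torch.log(s) - torch.log1p(-s)

x = torch.linspace(-3, 3, 5)
y = DeepSigmoidalFlow()(x)
print(torch.all(y[1:] > y[:-1]))  # strictly increasing, hence invertible
```

Because the transformer is strictly increasing in x for any valid parameter setting, invertibility holds regardless of what the conditioner outputs, which is what lets NAF swap a neural network in for the affine transformation of MAF/IAF.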

Related research

Unconstrained Monotonic Neural Networks (08/14/2019)
Monotonic neural networks have recently been proposed as a way to define...

LogitBoost autoregressive networks (03/22/2017)
Multivariate binary distributions can be decomposed into products of uni...

Gaussianization Flows (03/04/2020)
Iterative Gaussianization is a fixed-point iteration procedure that can ...

Sum-of-Squares Polynomial Flow (05/07/2019)
Triangular map is a recent construct in probability theory that allows o...

Autoregressive Quantile Flows for Predictive Uncertainty Estimation (12/09/2021)
Numerous applications of machine learning involve predicting flexible pr...

On the curse of dimensionality for Normalizing Flows (02/23/2023)
Normalizing Flows have emerged as a powerful brand of generative models,...
