ELF: Exact-Lipschitz Based Universal Density Approximator Flow

12/13/2021
by Achintya Gopal, et al.

Normalizing flows have grown more popular over the last few years; however, they remain computationally expensive, which makes them difficult for the broader machine learning community to adopt. In this paper, we introduce a simple one-dimensional one-layer network with closed-form Lipschitz constants; using this, we introduce a new Exact-Lipschitz Flow (ELF) that combines the ease of sampling of residual flows with the strong performance of autoregressive flows. Further, we show that ELF is provably a universal density approximator, is more computationally and parameter efficient than a multitude of other flows, and achieves state-of-the-art performance on multiple large-scale datasets.
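The abstract does not spell out the construction, but as a rough, hypothetical illustration of how a one-dimensional one-layer network can admit a closed-form Lipschitz bound, consider f(x) = sum_i w2_i * tanh(w1_i * x + b_i) + c: since 0 < tanh'(.) <= 1, the derivative satisfies |f'(x)| <= sum_i |w2_i * w1_i|. The sketch below uses this assumed form (the tanh activation, the network shape, and the bound are illustrative assumptions, not necessarily ELF's exact closed form) and checks the bound numerically.

import numpy as np

# Hypothetical 1D one-layer network f(x) = sum_i w2[i] * tanh(w1[i] * x + b[i]) + c.
rng = np.random.default_rng(0)
n_hidden = 8
w1 = rng.normal(size=n_hidden)   # input-to-hidden weights
b = rng.normal(size=n_hidden)    # hidden biases
w2 = rng.normal(size=n_hidden)   # hidden-to-output weights
c = rng.normal()                 # output bias

def f(x):
    return float(np.sum(w2 * np.tanh(w1 * x + b)) + c)

def lipschitz_bound():
    # |f'(x)| = |sum_i w2[i] * w1[i] * (1 - tanh(w1[i]*x + b[i])**2)|
    #        <= sum_i |w2[i] * w1[i]|, since 0 < 1 - tanh(.)**2 <= 1.
    # A simple closed-form upper bound; the exact constant in the paper may differ.
    return float(np.sum(np.abs(w1 * w2)))

# Empirical sanity check: finite-difference slopes should never exceed the bound.
xs = np.linspace(-5.0, 5.0, 10_001)
ys = np.array([f(x) for x in xs])
slopes = np.abs(np.diff(ys) / np.diff(xs))
print(f"max empirical slope {slopes.max():.4f} <= bound {lipschitz_bound():.4f}")

Keeping such a constant in closed form is what allows, for example, a residual block x + f(x) to be made invertible by rescaling f so its Lipschitz constant stays below 1, which is the general idea behind residual flows that the abstract refers to.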

Related research

09/16/2020 · Quasi-Autoregressive Residual (QuAR) Flows
Normalizing Flows are a powerful technique for learning and modeling pro...

11/14/2020 · Self Normalizing Flows
Efficient gradient computation of the Jacobian determinant term is a cor...

10/05/2020 · i-DenseNets
We introduce Invertible Dense Networks (i-DenseNets), a more parameter e...

02/04/2021 · Invertible DenseNets with Concatenated LipSwish
We introduce Invertible Dense Networks (i-DenseNets), a more parameter e...

07/15/2021 · On the expressivity of bi-Lipschitz normalizing flows
An invertible function is bi-Lipschitz if both the function and its inve...

06/05/2022 · AUTM Flow: Atomic Unrestricted Time Machine for Monotonic Normalizing Flows
Nonlinear monotone transformations are used extensively in normalizing f...

10/15/2022 · Invertible Monotone Operators for Normalizing Flows
Normalizing flows model probability distributions by learning invertible...
