Closed-Loop Data Transcription to an LDR via Minimaxing Rate Reduction

11/12/2021
by Xili Dai, et al.

This work proposes a new computational framework for learning an explicit generative model for real-world datasets. In particular, we propose to learn a closed-loop transcription between a multi-class, multi-dimensional data distribution and a linear discriminative representation (LDR) in the feature space, consisting of multiple independent multi-dimensional linear subspaces. We argue that the optimal encoding and decoding mappings sought can be formulated as the equilibrium point of a two-player minimax game between the encoder and decoder. A natural utility function for this game is the so-called rate reduction, a simple information-theoretic measure of distances between mixtures of subspace-like Gaussians in the feature space. Our formulation draws inspiration from closed-loop error feedback in control systems and avoids the expensive evaluation and minimization of approximated distances between arbitrary distributions in either the data space or the feature space. To a large extent, this new formulation unifies the concepts and benefits of auto-encoding and GANs, and naturally extends them to the setting of learning a representation that is both discriminative and generative for multi-class, multi-dimensional real-world data. Our extensive experiments on many benchmark imagery datasets demonstrate the tremendous potential of this new closed-loop formulation: under fair comparison, the visual quality of the learned decoder and the classification performance of the encoder are competitive with, and often better than, existing methods based on GANs, VAEs, or a combination of both. We observe that the features learned for different classes are explicitly mapped onto approximately independent principal subspaces in the feature space, and that diverse visual attributes within each class are modeled by independent principal components within each subspace.
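To make the utility function concrete, the rate reduction in question measures the total coding rate of all features minus the average coding rate of each class's features, so it is large when different classes occupy incoherent (nearly orthogonal) subspaces. Below is a minimal numpy sketch of that measure, not the authors' implementation; the function names, the precision parameter `eps`, and the column-wise feature layout are illustrative assumptions:

```python
import numpy as np

def coding_rate(Z, eps=0.5):
    """Approximate bits needed to code the columns of Z (d x m)
    up to precision eps: 1/2 * logdet(I + d/(m*eps^2) * Z Z^T)."""
    d, m = Z.shape
    gram = np.eye(d) + (d / (m * eps ** 2)) * (Z @ Z.T)
    return 0.5 * np.linalg.slogdet(gram)[1]

def rate_reduction(Z, labels, eps=0.5):
    """Rate reduction: coding rate of all features minus the
    sample-weighted coding rates of each class's features."""
    m = Z.shape[1]
    delta = coding_rate(Z, eps)
    for j in np.unique(labels):
        Zj = Z[:, labels == j]
        delta -= (Zj.shape[1] / m) * coding_rate(Zj, eps)
    return delta

rng = np.random.default_rng(0)
d, m = 4, 50
labels = np.array([0] * m + [1] * m)

# Two classes on orthogonal 1-D subspaces -> large rate reduction.
Z_orth = np.zeros((d, 2 * m))
Z_orth[0, :m] = rng.standard_normal(m)
Z_orth[1, m:] = rng.standard_normal(m)

# Both classes on the same 1-D subspace -> small rate reduction.
Z_same = np.zeros((d, 2 * m))
Z_same[0, :] = rng.standard_normal(2 * m)

print(rate_reduction(Z_orth, labels) > rate_reduction(Z_same, labels))
```

In the closed-loop game, the encoder plays to maximize this quantity (and the discrepancy between real and transcribed features), while the decoder plays to minimize it, so the equilibrium features settle into independent per-class subspaces.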

