A unified view for unsupervised representation learning with density ratio estimation: Maximization of mutual information, nonlinear ICA and nonlinear subspace estimation

01/06/2021
by   Hiroaki Sasaki, et al.
Unsupervised representation learning is one of the most important problems in machine learning. Recent promising methods are based on contrastive learning. However, contrastive learning often relies on heuristic ideas, and therefore it is not easy to understand what contrastive learning is doing. This paper emphasizes that density ratio estimation is a promising goal for unsupervised representation learning, and promotes an understanding of contrastive learning. Our primary contribution is to theoretically show that density ratio estimation unifies three frameworks for unsupervised representation learning: maximization of mutual information (MI), nonlinear independent component analysis (ICA), and a novel framework proposed in this paper for estimating a lower-dimensional nonlinear subspace. This unified view clarifies under what conditions contrastive learning can be regarded as maximizing MI, performing nonlinear ICA, or estimating the lower-dimensional nonlinear subspace in the proposed framework. Furthermore, we also make theoretical contributions in each of the three frameworks: we show that MI can be maximized through density ratio estimation under certain conditions, while our analysis for nonlinear ICA reveals a novel insight into the recovery of the latent source components, which is clearly supported by numerical experiments. In addition, theoretical conditions are established for estimating a nonlinear subspace in the proposed framework. Based on the unified view, we propose two practical methods for unsupervised representation learning through density ratio estimation: the first is an outlier-robust method for representation learning, while the second is a sample-efficient nonlinear ICA method. Finally, we numerically demonstrate the usefulness of the proposed methods in nonlinear ICA and through an application to a downstream classification task.
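The abstract repeatedly invokes density ratio estimation as the common thread behind contrastive learning and MI maximization. As a concrete illustration (not the paper's own algorithm), the sketch below shows the standard classification-based density-ratio trick: a balanced binary classifier separating samples from the joint p(x, y) against samples from the product of marginals p(x)p(y) recovers the log density ratio as its logit, and averaging that log-ratio over joint samples gives a plug-in MI estimate. The toy Gaussian data, quadratic feature map, and sample sizes are assumptions chosen purely for illustration.

```python
# A minimal sketch (assumed setup, not the authors' method) of the
# classification-based density-ratio trick underlying contrastive learning.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 20_000

# Toy correlated Gaussian pair (X, Y); analytic MI is -0.5 * log(1 - rho^2).
rho = 0.8
x = rng.normal(size=n)
y = rho * x + np.sqrt(1.0 - rho**2) * rng.normal(size=n)

# Positive class: samples from the joint p(x, y).
joint = np.column_stack([x, y])
# Negative class: samples from the product of marginals p(x)p(y),
# obtained by shuffling y independently of x.
product = np.column_stack([x, rng.permutation(y)])

features = np.vstack([joint, product])
labels = np.concatenate([np.ones(n), np.zeros(n)])

# A quadratic feature map suffices for this Gaussian toy problem; real
# representation learning would put a neural network here instead.
def phi(z):
    return np.column_stack([z, z[:, :1] * z[:, 1:], z**2])

clf = LogisticRegression(max_iter=1000).fit(phi(features), labels)

# With balanced classes, the classifier's logit estimates
# log p(x, y) / (p(x) p(y)), i.e., the log density ratio.
log_ratio = clf.decision_function(phi(joint))
mi_estimate = log_ratio.mean()
true_mi = -0.5 * np.log(1.0 - rho**2)
print(f"estimated MI: {mi_estimate:.3f}  (analytic: {true_mi:.3f})")
```

The same learned log-ratio is what the unified view interprets differently in each framework: as an MI estimate here, or as the objective whose optimizer recovers independent sources or a nonlinear subspace under the paper's conditions.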

