Subspace clustering in high-dimensions: Phase transitions & Statistical-to-Computational gap

05/26/2022
by Luca Pesce, et al.

A simple model for studying subspace clustering is the high-dimensional k-Gaussian mixture model in which the cluster means are sparse vectors. Here we provide an exact asymptotic characterization of the statistically optimal reconstruction error in this model in the high-dimensional regime with extensive sparsity, i.e. when the fraction ρ of non-zero components of the cluster means, as well as the ratio α between the number of samples and the dimension, are fixed while the dimension diverges. We identify the information-theoretic threshold below which obtaining a positive correlation with the true cluster means is statistically impossible. Additionally, we investigate the performance of the approximate message passing (AMP) algorithm, analyzed via its state evolution, which is conjectured to be optimal among polynomial-time algorithms for this task. In particular, we identify a statistical-to-computational gap between the algorithms, which require a signal-to-noise ratio λ_alg ≥ k/√α to perform better than random, and the information-theoretic threshold at λ_it ≈ √(-kρ log ρ)/√α. Finally, we discuss the case of sub-extensive sparsity ρ by comparing the performance of AMP with other sparsity-enhancing algorithms, such as sparse PCA and diagonal thresholding.
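As an illustration of the setting described above, the sketch below samples from a k-Gaussian mixture with sparse cluster means and evaluates the two thresholds quoted in the abstract. This is not the paper's code; the sampling convention, scaling, and all parameter values are assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

d = 2000      # dimension
alpha = 2.0   # samples-to-dimension ratio: n = alpha * d
rho = 0.05    # fraction of non-zero components in each cluster mean
k = 3         # number of clusters
snr = 4.0     # signal-to-noise ratio lambda

n = int(alpha * d)

# Sparse cluster means: each mean has on average rho*d non-zero Gaussian entries.
support = rng.random((k, d)) < rho
means = np.where(support, rng.standard_normal((k, d)), 0.0)

# Assign each sample to a cluster uniformly at random and add unit Gaussian
# noise; the signal is scaled so the SNR enters through sqrt(snr / d).
labels = rng.integers(0, k, size=n)
X = np.sqrt(snr / d) * means[labels] + rng.standard_normal((n, d))

# Thresholds as quoted in the abstract (up to constants):
lambda_alg = k / np.sqrt(alpha)                      # algorithmic (AMP)
lambda_it = np.sqrt(-k * rho * np.log(rho) / alpha)  # information-theoretic

print(f"lambda_alg ~ {lambda_alg:.3f}, lambda_it ~ {lambda_it:.3f}")
print("hard phase present (lambda_it < lambda_alg):", lambda_it < lambda_alg)
```

With these parameters λ_alg ≈ 2.12 while λ_it ≈ 0.47, so an SNR between the two falls in the conjectured hard phase: estimation is statistically possible but, per the abstract's conjecture, out of reach for polynomial-time algorithms such as AMP.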


Related research

02/13/2023
Optimal Algorithms for the Inhomogeneous Spiked Wigner Model
In this paper, we study a spiked Wigner problem with an inhomogeneous no...

06/14/2020
All-or-nothing statistical and computational phase transitions in sparse spiked matrix estimation
We determine statistical and computational limits for estimation of a ra...

10/10/2016
Phase transitions and optimal algorithms in high-dimensional Gaussian mixture clustering
We consider the problem of Gaussian mixture clustering in the high-dimen...

06/16/2013
Do semidefinite relaxations solve sparse PCA up to the information limit?
Estimating the leading principal components of data, assuming they are s...

08/14/2021
On Support Recovery with Sparse CCA: Information Theoretic and Computational Limits
In this paper we consider asymptotically exact support recovery in the c...

05/15/2018
On the glassy nature of the hard phase in inference problems
An algorithmically hard phase was described in a range of inference prob...

02/26/2020
The role of regularization in classification of high-dimensional noisy Gaussian mixture
We consider a high-dimensional mixture of two Gaussians in the noisy reg...
