Lattice-Based Methods Surpass Sum-of-Squares in Clustering

12/07/2021
by Ilias Zadik, et al.

Clustering is a fundamental primitive in unsupervised learning that gives rise to a rich class of computationally challenging inference tasks. In this work, we focus on the canonical task of clustering d-dimensional Gaussian mixtures with unknown (and possibly degenerate) covariance. Recent works (Ghosh et al. '20; Mao, Wein '21; Davis, Diaz, Wang '21) have established lower bounds against the class of low-degree polynomial methods and the sum-of-squares (SoS) hierarchy for recovering certain hidden structures planted in Gaussian clustering instances. For many similar inference tasks, prior work suggests that such lower bounds are strong evidence of an inherent statistical-to-computational gap for clustering, that is, a parameter regime where the clustering task is statistically possible but no polynomial-time algorithm succeeds. One special case of the clustering task we consider is equivalent to the problem of finding a planted hypercube vector in an otherwise random subspace. We show that, perhaps surprisingly, this particular clustering model does not exhibit a statistical-to-computational gap, even though the aforementioned low-degree and SoS lower bounds continue to apply in this case. To achieve this, we give a polynomial-time algorithm based on the Lenstra–Lenstra–Lovász (LLL) lattice basis reduction method which achieves the statistically optimal sample complexity of d+1 samples. This result extends the class of problems whose conjectured statistical-to-computational gaps can be "closed" by "brittle" polynomial-time algorithms, highlighting the crucial but subtle role of noise in the onset of statistical-to-computational gaps.
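
To make the lattice step concrete, the following is a minimal, illustrative Python sketch of the noiseless special case described above: recovering a planted hypercube vector from d+1 samples by reducing to an integer-relation problem and running LLL. This is a sketch under assumptions, not the paper's algorithm: the reduction to a single linear relation, the scaling constant K, and the use of the third-party fpylll package for LLL are illustrative choices.

import numpy as np
from fpylll import IntegerMatrix, LLL

rng = np.random.default_rng(0)
d = 8                                      # ambient dimension; n = d + 1 samples
mu = rng.standard_normal(d)
mu /= np.linalg.norm(mu)                   # unit mean direction, so Y @ mu = labels
labels = rng.choice([-1, 1], size=d + 1)   # hidden hypercube vector in {-1,+1}^(d+1)

# Mixture with degenerate covariance: the noise lives orthogonal to mu,
# so each sample's component along mu is exactly its +-1 label.
Z = rng.standard_normal((d + 1, d))
Z -= np.outer(Z @ mu, mu)                  # project the noise off the mu direction
Y = np.outer(labels, mu) + Z               # (d+1) x d matrix of samples

# labels lies in the column span of Y (Y @ mu = labels), hence
# labels[d] = c . labels[:d]  where  c solves  Y[:d].T c = Y[d].
c = np.linalg.solve(Y[:d].T, Y[d])

# Finding eps in {-1,+1}^d with c . eps = +-1 is an integer-relation problem.
# Encode it as a lattice with rows (e_i, round(K * c_i)) and (0, ..., 0, K):
# the planted signs give an unusually short lattice vector (eps, tiny).
K = 10**30                                 # huge scale; this exactness is the "brittle" step
a = [int(round(K * float(ci))) for ci in c]
B = IntegerMatrix(d + 1, d + 1)
for i in range(d):
    B[i, i] = 1
    B[i, d] = a[i]
B[d, d] = K                                # last row absorbs the +-1 right-hand side

LLL.reduction(B)                           # reduce the basis in place

# The planted vector should now appear (up to sign) among the reduced rows.
eps = None
for i in range(d + 1):
    row = np.array([B[i, j] for j in range(d)])
    if np.all(np.abs(row) == 1):           # a full +-1 row: the planted signs
        eps = row
        break
assert eps is not None, "LLL did not expose the planted vector"

full = np.append(eps, int(round(c @ eps)))
print("recovered up to global sign:",
      np.array_equal(full, labels) or np.array_equal(full, -labels))

The scale K is what presses the planted relation into a uniquely short lattice vector; perturbing the samples with even a small amount of genuine noise swamps that vector, which is exactly the "brittleness" the abstract highlights.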


Related research

10/05/2021 – Inferring Hidden Structures in Random Graphs
We study the two inference problems of detecting and recovering an isola...

10/22/2021 – Polynomial-Time Sum-of-Squares Can Robustly Estimate Mean and Covariance of Gaussians Optimally
In this work, we revisit the problem of estimating the mean and covarian...

11/04/2019 – Lifting Sum-of-Squares Lower Bounds: Degree-2 to Degree-4
The degree-4 Sum-of-Squares (SoS) SDP relaxation is a powerful algorithm...

06/22/2023 – SQ Lower Bounds for Learning Bounded Covariance GMMs
We study the complexity of learning mixtures of separated Gaussians with...

06/20/2021 – On the Cryptographic Hardness of Learning Single Periodic Neurons
We show a simple reduction which demonstrates the cryptographic hardness...

05/21/2020 – Computationally efficient sparse clustering
We study statistical and computational limits of clustering when the mea...

03/17/2018 – Learning Mixtures of Product Distributions via Higher Multilinear Moments
Learning mixtures of k binary product distributions is a central problem...
