The folded concave Laplacian spectral penalty learns block diagonal sparsity patterns with the strong oracle property

by Iain Carmichael et al.

Structured sparsity is an important part of the modern statistical toolkit. We say a set of model parameters has block diagonal sparsity up to permutations if its elements can be viewed as the edges of a graph with multiple connected components. For example, a block diagonal correlation matrix with K blocks of variables corresponds to a graph with K connected components, whose nodes are the variables and whose edges are the correlations. This type of sparsity captures clusters of model parameters. To learn block diagonal sparsity patterns, we develop the folded concave Laplacian spectral penalty and provide a majorization-minimization algorithm for the resulting non-convex problem. We show that this algorithm has the appealing computational and statistical guarantee of converging to the oracle estimator after two steps with high probability, even in high-dimensional settings. The theory is then demonstrated on several classical problems, including covariance estimation, linear regression, and logistic regression.
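The connection between block diagonal sparsity and connected components rests on a standard fact from spectral graph theory: the number of connected components of a graph equals the multiplicity of the zero eigenvalue of its Laplacian. A minimal sketch of this fact (not the paper's code; the function name and the toy adjacency matrix are illustrative only):

```python
import numpy as np

def zero_eigenvalue_multiplicity(A, tol=1e-10):
    """Count near-zero eigenvalues of the combinatorial Laplacian L = D - A.

    For an undirected graph with adjacency matrix A, this equals the
    number of connected components -- the spectral quantity that a
    Laplacian spectral penalty can target to encourage block structure.
    """
    L = np.diag(A.sum(axis=1)) - A      # combinatorial Laplacian
    eigvals = np.linalg.eigvalsh(L)     # L is symmetric positive semidefinite
    return int(np.sum(eigvals < tol))

# Toy example: variables {0, 1} and {2, 3} form two blocks, with edges
# (nonzero "correlations") only within each block.
A = np.array([[0., 1., 0., 0.],
              [1., 0., 0., 0.],
              [0., 0., 0., 1.],
              [0., 0., 1., 0.]])
print(zero_eigenvalue_multiplicity(A))  # 2 connected components
```

A penalty that drives several Laplacian eigenvalues to zero thus pushes the estimated parameter graph toward a block diagonal (up to permutation) sparsity pattern.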


