GEP-MSCRA for computing the group zero-norm regularized least squares estimator

by Shujun Bi, et al.

This paper is concerned with the group zero-norm regularized least squares estimator which, by the variational characterization of the zero-norm, can be obtained from a mathematical program with equilibrium constraints (MPEC). By developing a global exact penalty for the MPEC, this estimator is shown to arise from an exact penalization problem that not only has a favorable bilinear structure but also provides a recipe for deriving equivalent DC estimators such as the SCAD and MCP estimators. We propose a multi-stage convex relaxation approach (GEP-MSCRA) for computing this estimator and, under a restricted strong convexity assumption on the design matrix, establish its theoretical guarantees, including decreasing error bounds from the iterates to the true coefficient vector and the coincidence of the iterates with the oracle estimator after finitely many steps. Finally, we implement the GEP-MSCRA with its subproblems solved by a semismooth Newton augmented Lagrangian method (ALM), and compare its performance with that of SLEP and MALSAR, solvers for the weighted ℓ_2,1-norm regularized estimator, on synthetic group sparse regression problems and real multi-task learning problems. The numerical comparison indicates that the GEP-MSCRA has a significant advantage over SLEP and MALSAR in reducing error and achieving better sparsity.
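A minimal sketch of the multi-stage convex relaxation idea, assuming a plain proximal-gradient solver for the weighted ℓ_2,1-norm subproblems in place of the paper's semismooth Newton ALM, and an illustrative inverse-norm weight update (the function names, the `eps` parameter, and the update rule are hypothetical, not the exact GEP-MSCRA scheme):

```python
import numpy as np

def group_soft_threshold(v, t):
    # Block soft-thresholding: prox of t * ||.||_2 applied to one group.
    nv = np.linalg.norm(v)
    return np.zeros_like(v) if nv <= t else (1.0 - t / nv) * v

def weighted_group_lasso(X, y, groups, weights, lam, step, n_iter=500):
    # Proximal gradient for 0.5*||y - X b||^2 + lam * sum_g w_g ||b_g||_2.
    b = np.zeros(X.shape[1])
    for _ in range(n_iter):
        z = b - step * (X.T @ (X @ b - y))       # gradient step
        for g, w in zip(groups, weights):
            z[g] = group_soft_threshold(z[g], step * lam * w)
        b = z
    return b

def mscra(X, y, groups, lam, n_stages=3, eps=1e-3):
    # Multi-stage scheme: each stage solves a weighted l_{2,1} subproblem,
    # then downweights groups that are already large, so shrinkage
    # concentrates on the (presumed) inactive groups.
    step = 1.0 / np.linalg.norm(X, 2) ** 2       # 1 / Lipschitz constant
    weights = np.ones(len(groups))
    b = np.zeros(X.shape[1])
    for _ in range(n_stages):
        b = weighted_group_lasso(X, y, groups, weights, lam, step)
        norms = np.array([np.linalg.norm(b[g]) for g in groups])
        weights = eps / (norms + eps)            # illustrative update only
    return b
```

The first stage is the ordinary group lasso (unit weights); later stages approximate the group zero-norm by nearly removing the penalty on groups already identified as active, which mirrors how the multi-stage relaxation sharpens the convex surrogate toward the exact-penalty problem.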



Equivalent Lipschitz surrogates for zero-norm and rank optimization problems

This paper proposes a mechanism to produce equivalent Lipschitz surrogat...

The Adaptive τ-Lasso: Its Robustness and Oracle Properties

This paper introduces a new regularized version of the robust τ-regressi...

Nuclear Norm Regularized Estimation of Panel Regression Models

In this paper we investigate panel regression models with interactive fi...

Error bound of local minima and KL property of exponent 1/2 for squared F-norm regularized factorization

This paper is concerned with the squared F(robenius)-norm regularized fa...

KL property of exponent 1/2 of ℓ_2,0-norm and DC regularized factorizations for low-rank matrix recovery

This paper is concerned with the factorization form of the rank regulari...

Robust Nonparametric Regression via Sparsity Control with Application to Load Curve Data Cleansing

Nonparametric methods are widely applicable to statistical inference pro...

Error bounds for sparse classifiers in high-dimensions

We prove an L2 recovery bound for a family of sparse estimators defined ...
