Sparse recovery via nonconvex regularized M-estimators over ℓ_q-balls

11/19/2019
by   Xin Li, et al.

In this paper, we analyse the recovery properties of nonconvex regularized M-estimators under the assumption that the true parameter exhibits soft sparsity, i.e., lies in an ℓ_q-ball. On the statistical side, we establish a recovery bound for any stationary point of the nonconvex regularized M-estimator, under restricted strong convexity and certain regularity conditions on the loss function and the regularizer, respectively. On the algorithmic side, we decompose the objective function suitably and solve the resulting nonconvex optimization problem via the proximal gradient method, which is proved to achieve a linear convergence rate. In particular, we note that for commonly used regularizers such as SCAD and MCP, a simpler decomposition is applicable thanks to our assumption on the regularizer, which helps to construct an estimator with better recovery performance. Finally, we illustrate our theoretical results and the advantage of this assumption through several numerical experiments on the errors-in-variables linear regression model with corrupted covariates. Simulation results show strong agreement with our theory under high-dimensional scaling.
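To make the algorithmic idea concrete, the following is a minimal sketch of a proximal gradient iteration with an MCP regularizer decomposed into an ℓ_1 term plus a smooth concave part, so that the proximal step reduces to soft thresholding. This is an illustration only, not the paper's exact estimator: it uses a clean least-squares loss rather than the corrected errors-in-variables loss, and the function names and parameters (lam, gamma, step, n_iter) are hypothetical choices for the example.

```python
import numpy as np

def mcp_concave_grad(beta, lam, gamma):
    # Gradient of the smooth concave part q(t) = MCP(t) - lam*|t|:
    # q'(t) = -t/gamma for |t| <= gamma*lam, and -lam*sign(t) otherwise.
    return np.where(np.abs(beta) <= gamma * lam,
                    -beta / gamma,
                    -lam * np.sign(beta))

def soft_threshold(z, tau):
    # Proximal operator of tau * ||.||_1 (soft thresholding).
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def prox_grad_mcp(X, y, lam=0.1, gamma=3.0, step=None, n_iter=500):
    # Composite gradient descent for (1/2n)||y - X beta||^2 + MCP(beta),
    # treating [loss + concave part] as the smooth term and lam*||.||_1
    # as the nonsmooth term handled by the prox.
    n, p = X.shape
    if step is None:
        # 1/L, with L an upper bound on the Lipschitz constant of the
        # smooth part: ||X||_2^2 / n from the loss plus 1/gamma from q.
        step = 1.0 / (np.linalg.norm(X, 2) ** 2 / n + 1.0 / gamma)
    beta = np.zeros(p)
    for _ in range(n_iter):
        grad = X.T @ (X @ beta - y) / n + mcp_concave_grad(beta, lam, gamma)
        beta = soft_threshold(beta - step * grad, step * lam)
    return beta
```

The decomposition mirrors the property highlighted in the abstract: because SCAD and MCP differ from the ℓ_1 norm by a smooth concave function, each iteration only needs a gradient step on the smooth part followed by a closed-form soft-thresholding step.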
