The Trimmed Lasso: Sparsity and Robustness

08/15/2017
by Dimitris Bertsimas, et al.

Nonconvex penalty methods for sparse modeling in linear regression have been a topic of fervent interest in recent years. Herein, we study a family of nonconvex penalty functions that we call the trimmed Lasso and that offers exact control over the desired level of sparsity of estimators. We analyze its structural properties and in doing so show the following: 1) Drawing parallels between robust statistics and robust optimization, we show that the trimmed-Lasso-regularized least squares problem can be viewed as a generalized form of total least squares under a specific model of uncertainty. In contrast, this same model of uncertainty, viewed instead through a robust optimization lens, leads to the convex SLOPE (or OWL) penalty. 2) Further, in relating the trimmed Lasso to commonly used sparsity-inducing penalty functions, we provide a succinct characterization of the connection between trimmed-Lasso-like approaches and penalty functions that are coordinate-wise separable, showing that the trimmed penalties subsume existing coordinate-wise separable penalties, with strict containment in general. 3) Finally, we describe a variety of exact and heuristic algorithms, both existing and new, for trimmed-Lasso-regularized estimation problems. We include a comparison between the different approaches and an accompanying implementation of the algorithms.
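To make the "exact control over sparsity" claim concrete: the trimmed Lasso penalty T_k(β) sums the p − k smallest absolute entries of β, so it vanishes exactly when β has at most k nonzero coordinates. A minimal sketch of this penalty (an illustration of the definition, not the paper's accompanying implementation):

```python
import numpy as np

def trimmed_lasso_penalty(beta, k):
    """Trimmed Lasso penalty T_k(beta): the sum of the p - k smallest
    absolute entries of beta. It is zero if and only if beta has at
    most k nonzero entries, giving exact control over sparsity."""
    # Sort magnitudes in descending order and drop the k largest.
    abs_desc = np.sort(np.abs(np.asarray(beta, dtype=float)))[::-1]
    return abs_desc[k:].sum()

beta = np.array([3.0, -0.5, 0.0, 2.0, 0.1])
print(trimmed_lasso_penalty(beta, 2))  # 0.6: sums the magnitudes 0.5, 0.1, 0.0
print(trimmed_lasso_penalty(np.array([3.0, 0.0, -2.0, 0.0, 1.0]), 3))  # 0.0: beta is 3-sparse
```

Adding λ·T_k(β) to a least-squares objective therefore penalizes only the coefficients beyond the k largest, in contrast to coordinate-wise separable penalties such as the ordinary Lasso, which shrink every coefficient.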

