Compressed Sparse Linear Regression

High-dimensional sparse linear regression is a basic problem in machine learning and statistics. Consider the linear model y = Xθ^* + w, where y ∈ R^n is the vector of observations, X ∈ R^{n × d} is the covariate matrix, and w ∈ R^n is an unknown noise vector. In many applications, the linear regression model is high-dimensional in nature, meaning that the number of observations n may be substantially smaller than the number of covariates d. In these cases, it is common to assume that θ^* is sparse, and the goal in sparse linear regression is to estimate this sparse θ^* given (X, y). In this paper, we study a variant of the traditional sparse linear regression problem in which each of the n covariate vectors in R^d is individually projected by a random linear transformation to R^m with m ≪ d. Such transformations are commonly applied in practice to save resources such as storage space, transmission bandwidth, and processing time. Our main result shows that, under some mild assumptions on the problem instance, one can estimate θ^* with low ℓ_2-error even with access only to these projected covariate vectors. Our approach is based on solving a variant of the popular Lasso optimization problem. While the conditions (such as the restricted eigenvalue condition on X) under which the standard Lasso formulation successfully estimates θ^* are well understood, we investigate the conditions under which this variant of the Lasso estimates θ^*. As a simple consequence, our approach also provides a new way of estimating θ^* in the traditional sparse linear regression setting, which operates (even) under a weaker assumption on the design matrix than previously known, albeit achieving a weaker convergence bound.
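A minimal simulation sketch of this setup in Python (NumPy + scikit-learn) is given below. The per-covariate Gaussian projection, the back-projected design rows, and the plain scikit-learn Lasso call are illustrative assumptions, not the paper's exact Lasso variant or its guarantees; they only make the compressed-covariate setting concrete.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# Illustrative problem sizes: n observations, d covariates, sparsity s,
# and compressed dimension m << d.
n, d, m, s = 400, 1000, 200, 5

# Sparse ground truth theta^* and the uncompressed model y = X theta^* + w.
theta_star = np.zeros(d)
theta_star[rng.choice(d, size=s, replace=False)] = rng.normal(size=s)
X = rng.normal(size=(n, d))
y = X @ theta_star + 0.1 * rng.normal(size=n)

# Each covariate vector x_i is compressed individually by a random Gaussian
# map Phi_i : R^d -> R^m; only the compressed rows Z[i] = Phi_i x_i (and y)
# are available to the estimator.
Z = np.zeros((n, m))
A = np.zeros((n, d))
for i in range(n):
    Phi_i = rng.normal(size=(m, d)) / np.sqrt(m)
    Z[i] = Phi_i @ X[i]          # observed compressed covariates
    A[i] = Phi_i.T @ Z[i]        # back-projected row fed to the Lasso below

# Sketch of a Lasso-style estimator on the compressed data: random projections
# roughly preserve inner products, so x_i^T theta ~ (Phi_i x_i)^T (Phi_i theta),
# which motivates regressing y on the back-projected rows A.  This is an
# illustrative formulation, not necessarily the paper's exact Lasso variant.
lasso = Lasso(alpha=0.1, fit_intercept=False)
lasso.fit(A, y)
theta_hat = lasso.coef_

# Report the l_2 estimation error, the quantity the paper's guarantee bounds.
print("relative l2 error:",
      np.linalg.norm(theta_hat - theta_star) / np.linalg.norm(theta_star))
```

The back-projection step is used here only because random projections approximately preserve inner products; any formulation that exploits this property in a Lasso-type objective would serve the same illustrative purpose.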

