Localized Lasso for High-Dimensional Regression

03/22/2016
by Makoto Yamada, et al.

We introduce the localized Lasso, which is suited for learning models that are both interpretable and have high predictive power in problems with high dimensionality d and small sample size n. More specifically, we consider a function defined by local sparse models, one at each data point. We introduce sample-wise network regularization to borrow strength across the models, and sample-wise exclusive group sparsity (a.k.a. the ℓ_{1,2} norm) to introduce diversity into the choice of feature sets in the local models. The local models are interpretable in terms of the similarity of their sparsity patterns. The cost function is convex and thus has a globally optimal solution. Moreover, we propose a simple yet efficient iterative least-squares-based optimization procedure for the localized Lasso, which does not need a tuning parameter and is guaranteed to converge to a globally optimal solution. The solution is empirically shown to outperform alternatives on both simulated data and genomic data for personalized medicine.
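To make the structure concrete, below is a minimal sketch of an objective of the kind the abstract describes: a per-sample squared loss over local models w_i, a network term that couples the models of similar samples, and a squared ℓ_1 (exclusive group sparsity) penalty on each local model. The sample-similarity matrix R and the parameter names lam_net and lam_exc are illustrative assumptions rather than the paper's notation, and the snippet only evaluates such an objective; it does not implement the paper's iterative least-squares solver.

import numpy as np

def localized_lasso_objective(W, X, y, R, lam_net, lam_exc):
    """Evaluate a localized-Lasso-style objective (illustrative sketch).

    W: (n, d) local models, one weight vector per sample
    X: (n, d) features, y: (n,) targets
    R: (n, n) nonnegative sample-similarity network
    """
    # Local fit: each sample i is predicted by its own weight vector w_i.
    fit = np.sum((y - np.einsum("ij,ij->i", X, W)) ** 2)
    # Network regularization: borrow strength between models of similar samples,
    # penalizing R[i, j] * ||w_i - w_j||_2.
    diffs = W[:, None, :] - W[None, :, :]                 # shape (n, n, d)
    network = np.sum(R * np.linalg.norm(diffs, axis=2))
    # Exclusive group sparsity: squared l1 norm per local model, which
    # encourages different models to select diverse feature subsets.
    exclusive = np.sum(np.sum(np.abs(W), axis=1) ** 2)
    return fit + lam_net * network + lam_exc * exclusive

# Toy usage with random data.
rng = np.random.default_rng(0)
n, d = 8, 5
X = rng.standard_normal((n, d))
y = rng.standard_normal(n)
W = rng.standard_normal((n, d))
R = np.abs(rng.standard_normal((n, n)))
print(localized_lasso_objective(W, X, y, R, lam_net=1.0, lam_exc=0.1))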


