Estimation with Norm Regularization

by Arindam Banerjee et al.

Analysis of non-asymptotic estimation error and structured statistical recovery based on norm-regularized regression, such as the Lasso, must consider four aspects: the norm, the loss function, the design matrix, and the noise model. This paper generalizes such estimation error analysis along all four aspects relative to the existing literature. We characterize the restricted error set in which the estimation error vector lies, establish relations between the error sets for the constrained and regularized problems, and present an estimation error bound applicable to any norm. Precise characterizations of the bound are presented for isotropic as well as anisotropic sub-Gaussian design matrices, sub-Gaussian noise models, and convex loss functions, including least squares and generalized linear models. Generic chaining and associated results play an important role in the analysis. A key result is that the sample complexity of all such estimators depends on the Gaussian width of a spherical cap corresponding to the restricted error set. Further, once the number of samples n exceeds the required sample complexity, the estimation error decreases as c/√n, where c depends on the Gaussian width of the unit norm ball.
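The c/√n error decay described in the abstract can be illustrated empirically. Below is a minimal sketch, not taken from the paper, that fits a Lasso estimator via ISTA (proximal gradient descent) on a synthetic sparse regression problem with an isotropic Gaussian design and Gaussian (hence sub-Gaussian) noise, and records the ℓ2 estimation error as the sample size n grows. The helper names `soft_threshold` and `lasso_ista`, and the regularization scaling λ ∝ √(log p / n), are illustrative assumptions rather than the paper's notation.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of t * ||.||_1 (elementwise soft-thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_ista(X, y, lam, n_iter=500):
    """Solve min_theta (1/2n)||y - X theta||_2^2 + lam ||theta||_1 via ISTA."""
    n, p = X.shape
    theta = np.zeros(p)
    L = np.linalg.norm(X, 2) ** 2 / n  # Lipschitz constant of the smooth part
    for _ in range(n_iter):
        grad = X.T @ (X @ theta - y) / n
        theta = soft_threshold(theta - grad / L, lam / L)
    return theta

rng = np.random.default_rng(0)
p, s = 200, 5                      # ambient dimension, sparsity level
theta_star = np.zeros(p)
theta_star[:s] = 1.0               # s-sparse ground truth

errors = []
for n in (100, 400, 1600):
    X = rng.standard_normal((n, p))                     # isotropic design
    y = X @ theta_star + 0.5 * rng.standard_normal(n)   # sub-Gaussian noise
    lam = 0.5 * np.sqrt(2 * np.log(p) / n)              # standard Lasso scaling
    errors.append(np.linalg.norm(lasso_ista(X, y, lam) - theta_star))

print(errors)  # estimation error shrinks as n grows
```

Once n clears the sample complexity threshold (roughly s log p for the Lasso, matching the Gaussian-width characterization above), quadrupling n should roughly halve the error, consistent with the c/√n rate.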




Error bound of local minima and KL property of exponent 1/2 for squared F-norm regularized factorization

This paper is concerned with the squared F(robenius)-norm regularized fa...

Consistent Estimation for PCA and Sparse Regression with Oblivious Outliers

We develop machinery to design efficiently computable and consistent est...

Asymptotic Characterisation of Robust Empirical Risk Minimisation Performance in the Presence of Outliers

We study robust linear regression in high-dimension, when both the dimen...

System Identification via Nuclear Norm Regularization

This paper studies the problem of identifying low-order linear systems v...

Robust Structured Statistical Estimation via Conditional Gradient Type Methods

Structured statistical estimation problems are often solved by Condition...

Adaptive Noisy Data Augmentation for Regularized Estimation and Inference in Generalized Linear Models

We propose the AdaPtive Noise Augmentation (PANDA) procedure to regulari...

Toward L_∞-recovery of Nonlinear Functions: A Polynomial Sample Complexity Bound for Gaussian Random Fields

Many machine learning applications require learning a function with a sm...
