Effective Dimension Adaptive Sketching Methods for Faster Regularized Least-Squares Optimization

by Jonathan Lacotte et al.

We propose a new randomized algorithm for solving L2-regularized least-squares problems based on sketching. We consider two of the most popular random embeddings, namely, Gaussian embeddings and the Subsampled Randomized Hadamard Transform (SRHT). While current randomized solvers for least-squares optimization prescribe an embedding dimension at least as large as the data dimension, we show that the embedding dimension can be reduced to the effective dimension of the optimization problem while still preserving high-probability convergence guarantees. To this end, we derive sharp matrix deviation inequalities over ellipsoids for both Gaussian and SRHT embeddings. Specifically, we improve on the constant of a classical Gaussian concentration bound, whereas, for SRHT embeddings, our deviation inequality involves a novel technical approach. Leveraging these bounds, we design a practical and adaptive algorithm which does not require knowing the effective dimension beforehand. Our method starts with an initial embedding dimension equal to 1 and, over iterations, increases the embedding dimension up to the effective one. Finally, we prove that our algorithm improves the state-of-the-art computational complexity for solving regularized least-squares problems. Further, we show numerically that it outperforms standard least-squares solvers such as the conjugate gradient method and its preconditioned version on several standard machine learning datasets.




