High-Dimensional Optimization in Adaptive Random Subspaces

06/27/2019
by Jonathan Lacotte, et al.

We propose a new randomized optimization method for high-dimensional problems which can be seen as a generalization of coordinate descent to random subspaces. We show that an adaptive sampling strategy for the random subspace significantly outperforms the oblivious sampling method, which is the common choice in the recent literature. The adaptive subspace can be efficiently generated by a correlated random matrix ensemble whose statistics mimic the input data. We prove that the improvement in the relative error of the solution can be tightly characterized in terms of the spectrum of the data matrix, and provide probabilistic upper bounds. We then illustrate the consequences of our theory with data matrices of different spectral decay. Extensive experimental results show that the proposed approach offers significant speedups in machine learning problems, including logistic regression, kernel classification with random convolution layers, and shallow neural networks with rectified linear units. Our analysis is based on convex analysis and Fenchel duality, and establishes connections to sketching and randomized matrix decomposition.
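The following minimal numpy sketch is one illustrative reading of the abstract's comparison, not a reproduction of the paper's algorithm: it solves a ridge-regression problem restricted to a low-dimensional random subspace, once with an oblivious Gaussian subspace and once with an "adaptive" subspace S = AᵀG whose column statistics mimic the data covariance AᵀA. The problem sizes, the ridge objective, and the specific form S = AᵀG are assumptions made for the example.

```python
# Illustrative sketch (assumed setup, not the paper's exact method):
# minimize a ridge objective over x restricted to x = S @ a with S in R^{d x m},
# comparing an oblivious Gaussian subspace with a data-correlated one.
import numpy as np

rng = np.random.default_rng(0)

n, d, m, lam = 500, 2000, 50, 1e-2          # samples, dimension, subspace size, ridge
A = rng.standard_normal((n, d)) * np.exp(-0.05 * np.arange(d))  # decaying column scales
x_true = rng.standard_normal(d)
b = A @ x_true + 0.1 * rng.standard_normal(n)

def objective(x):
    return 0.5 * np.linalg.norm(A @ x - b) ** 2 + 0.5 * lam * np.linalg.norm(x) ** 2

def subspace_solution(S):
    """Minimize 0.5*||A S a - b||^2 + 0.5*lam*||S a||^2 over a, return x = S a."""
    AS = A @ S
    H = AS.T @ AS + lam * (S.T @ S)
    a = np.linalg.solve(H, AS.T @ b)
    return S @ a

# Oblivious subspace: i.i.d. Gaussian directions, independent of the data.
S_obl = rng.standard_normal((d, m))
# Adaptive subspace: directions correlated with the data, here S = A^T G.
S_ada = A.T @ rng.standard_normal((n, m))

# Unrestricted optimum for reference.
x_full = np.linalg.solve(A.T @ A + lam * np.eye(d), A.T @ b)

for name, S in [("oblivious", S_obl), ("adaptive", S_ada)]:
    x = subspace_solution(S)
    rel_err = (objective(x) - objective(x_full)) / objective(x_full)
    print(f"{name:9s} subspace: relative suboptimality = {rel_err:.3e}")
```

With a fast-decaying spectrum, the data-correlated subspace typically achieves a much smaller relative suboptimality than the oblivious one at the same subspace dimension m, which is the qualitative behavior the abstract describes.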
