Randomized Riemannian Preconditioning for Quadratically Constrained Problems

02/05/2019
by   Boris Shustin, et al.

Optimization problems with quadratic equality constraints are prevalent in machine learning; two important examples are Canonical Correlation Analysis (CCA) and Linear Discriminant Analysis (LDA). Unfortunately, methods for solving such problems typically involve computing matrix inverses and decompositions. For the aforementioned problems, these matrices are Gram matrices of the input data matrices, so the computations become too expensive for large-scale datasets. In this paper, we propose a sketching-based approach for solving CCA and LDA that reduces the cost's dependence on the input size. The proposed algorithms combine randomized preconditioning with Riemannian optimization.
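The core idea in the abstract, replacing exact computations on a large Gram matrix with a cheap randomized sketch of it, can be illustrated with a minimal example. This is not the paper's algorithm; the Gaussian sketch, dimensions, and seed below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Tall data matrix X (n samples, d features); its Gram matrix X^T X is
# what CCA/LDA solvers repeatedly need to invert or decompose.
n, d = 10_000, 20
X = rng.standard_normal((n, d))

# Gaussian sketch: compress the n rows down to s << n rows. The sketched
# Gram matrix (SX)^T (SX) approximates X^T X at a fraction of the cost,
# which is the sense in which it can serve as a preconditioner for an
# iterative (e.g. Riemannian) solver.
s = 500
S = rng.standard_normal((s, n)) / np.sqrt(s)
SX = S @ X

gram = X.T @ X
gram_sketch = SX.T @ SX

# Relative spectral-norm error of the sketched Gram matrix.
rel_err = np.linalg.norm(gram - gram_sketch, 2) / np.linalg.norm(gram, 2)
print(rel_err)
```

Forming `SX.T @ SX` costs O(s·d²) after sketching instead of O(n·d²), and for a well-chosen sketch size the approximation error stays modest, which is what makes sketched preconditioners attractive for large n.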


