Nonlinear matrix recovery using optimization on the Grassmann manifold

09/13/2021
by Florentin Goyens, et al.

We investigate the problem of recovering a partially observed high-rank matrix whose columns obey a nonlinear structure, such as belonging to a union of subspaces, lying on an algebraic variety, or being grouped in clusters. The recovery problem is formulated as the rank minimization of a nonlinear feature map applied to the original matrix, which is then further approximated by a constrained non-convex optimization problem over the Grassmann manifold. We propose two sets of algorithms, one arising from Riemannian optimization and the other from an alternating minimization scheme, both of which include first- and second-order variants. Both sets of algorithms have theoretical guarantees. In particular, for the alternating minimization, we establish global convergence and worst-case complexity bounds. Additionally, using the Kurdyka-Łojasiewicz property, we show that the alternating minimization converges to a unique limit point. We provide extensive numerical results for the recovery of unions of subspaces and clustering under entry sampling and dense Gaussian sampling. Our methods are competitive with existing approaches and, in particular, achieve high recovery accuracy with Riemannian second-order methods.
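The paper's nonlinear approach generalizes classical low-rank matrix completion, for which alternating minimization is the standard baseline. As background only, here is a minimal alternating least-squares sketch for the linear (low-rank) special case in plain NumPy; this is an illustrative toy, not the paper's algorithm, and the function name and parameters are our own:

```python
import numpy as np

def alternating_minimization(M, mask, rank, iters=50, seed=0):
    """Toy alternating least-squares for low-rank matrix completion.

    Alternately solves least-squares problems for factors U (m x r) and
    V (n x r) so that U @ V.T matches M on the observed entries
    (mask[i, j] == True means entry (i, j) is observed).
    """
    rng = np.random.default_rng(seed)
    m, n = M.shape
    U = rng.standard_normal((m, rank))
    V = rng.standard_normal((n, rank))
    for _ in range(iters):
        # Fix V; each row of U solves a small least-squares problem
        # restricted to that row's observed entries.
        for i in range(m):
            obs = mask[i]
            if obs.any():
                U[i] = np.linalg.lstsq(V[obs], M[i, obs], rcond=None)[0]
        # Fix U; symmetrically update each row of V.
        for j in range(n):
            obs = mask[:, j]
            if obs.any():
                V[j] = np.linalg.lstsq(U[obs], M[obs, j], rcond=None)[0]
    return U, V
```

On a noiseless rank-r matrix with enough randomly observed entries, the product `U @ V.T` typically fits the observations to high accuracy; the paper's methods replace this bilinear model with a nonlinear feature map and optimization over the Grassmann manifold.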

