Adaptive Stochastic Gradient Descent on the Grassmannian for Robust Low-Rank Subspace Recovery and Clustering

by Jun He, et al.
Nanjing University of Information Science and Technology
NetEase, Inc.

In this paper, we present GASG21 (Grassmannian Adaptive Stochastic Gradient for L_2,1-norm minimization), an adaptive stochastic gradient algorithm that robustly recovers a low-rank subspace from a large matrix. In the presence of column outliers, we reformulate the batch-mode rank-constrained matrix L_2,1-norm minimization problem as a stochastic optimization problem on the Grassmann manifold. For each observed data vector, the low-rank subspace S is updated by taking a gradient step along a geodesic of the Grassmannian. To accelerate the convergence of the stochastic gradient method, we adaptively tune the step-size by leveraging consecutive gradients. Furthermore, we demonstrate that, with proper initialization, the K-subspaces extension, K-GASG21, can robustly cluster a large number of corrupted data vectors into a union of subspaces. Numerical experiments on synthetic and real data demonstrate the efficiency and accuracy of the proposed algorithms even under heavy column-outlier corruption.
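As a rough illustration of the geodesic gradient step described above, the sketch below implements a classical GROUSE-style rank-one update on the Grassmannian for fully observed vectors. It is a simplification, not the paper's algorithm: the fixed step-size `eta` stands in for GASG21's adaptive step-size rule, and the squared-residual gradient is used in place of the robust L_2,1 weighting; all names are illustrative.

```python
import numpy as np

def geodesic_step(U, v, eta):
    """One stochastic gradient step along a Grassmannian geodesic.

    U   : (n, d) orthonormal basis of the current subspace estimate
    v   : (n,)   observed data vector
    eta : fixed step-size (a placeholder for an adaptive rule)
    Returns an updated (n, d) orthonormal basis.
    """
    w = U.T @ v                      # coefficients of v in the subspace
    p = U @ w                        # projection of v onto span(U)
    r = v - p                        # residual, orthogonal to span(U)
    wn, rn = np.linalg.norm(w), np.linalg.norm(r)
    sigma = rn * wn                  # gradient magnitude on the manifold
    if sigma < 1e-12:                # v already lies in the subspace
        return U
    theta = sigma * eta
    # Rank-one geodesic correction; preserves U^T U = I exactly.
    step = (np.cos(theta) - 1.0) * p / wn + np.sin(theta) * r / rn
    return U + np.outer(step, w / wn)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, d = 50, 5
    B = rng.standard_normal((n, d))                 # basis of the true subspace
    U, _ = np.linalg.qr(rng.standard_normal((n, d)))  # random initial estimate
    for _ in range(1000):
        v = B @ rng.standard_normal(d)              # stream a vector from span(B)
        U = geodesic_step(U, v, eta=0.005)
    # U stays orthonormal, and span(U) approaches span(B)
    print(np.allclose(U.T @ U, np.eye(d), atol=1e-6))
```

Because the update moves along a geodesic rather than taking a raw Euclidean step, no re-orthonormalization of `U` is needed between iterations.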



