Nonsmooth Optimization over Stiefel Manifold: Riemannian Subgradient Methods

11/12/2019
by Xiao Li, et al.

Nonsmooth Riemannian optimization is a still underexplored subfield of manifold optimization. In this paper, we study optimization problems over the Stiefel manifold with nonsmooth objective functions. Problems of this type arise widely in engineering applications. We propose to address them with Riemannian subgradient-type methods, including the Riemannian full, incremental, and stochastic subgradient methods. When the objective function is weakly convex, we show that these algorithms have iteration complexity O(ε^{-4}) for driving a surrogate stationarity measure below ε. Moreover, the Riemannian full and incremental subgradient methods achieve local linear convergence when the optimization problem further satisfies the sharpness regularity property. The fundamental ingredient for establishing these convergence results is that any locally Lipschitz continuous weakly convex function in the Euclidean space admits a Riemannian subgradient inequality uniformly over the Stiefel manifold, a result of independent interest. We then extend our convergence results to a broader class of compact Riemannian manifolds embedded in Euclidean space. Finally, as a demonstration of applications, we discuss the sharpness property for robust subspace recovery and orthogonal dictionary learning, and conduct experiments on these two problems to illustrate the performance of our algorithms.
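To make the algorithmic template concrete, below is a minimal NumPy sketch of a single Riemannian subgradient step on the Stiefel manifold St(n, r) = {X ∈ R^{n×r} : XᵀX = I}. The tangent-space projection formula is the standard one for the embedded Stiefel manifold; the QR-based retraction, the function names, and the toy weakly convex objective f(X) = ‖AX‖₁ in the usage snippet are illustrative assumptions, not necessarily the paper's exact implementation choices.

```python
import numpy as np

def stiefel_subgradient_step(X, euclid_subgrad, step_size):
    """One Riemannian subgradient step on St(n, r) (illustrative sketch).

    X              : n x r matrix with orthonormal columns (X^T X = I_r)
    euclid_subgrad : a Euclidean subgradient of the objective at X
    step_size      : positive step size (e.g., diminishing like O(1/sqrt(k)))
    """
    # Project the Euclidean subgradient onto the tangent space at X:
    # Proj_X(G) = G - X * sym(X^T G), where sym(A) = (A + A^T) / 2.
    XtG = X.T @ euclid_subgrad
    riem_subgrad = euclid_subgrad - X @ ((XtG + XtG.T) / 2)

    # Move along the negative Riemannian subgradient, then retract back
    # onto the manifold via a QR-based retraction (one common choice).
    Q, R = np.linalg.qr(X - step_size * riem_subgrad)
    # Fix column signs so the QR factor is uniquely determined.
    return Q * np.sign(np.sign(np.diag(R)) + 0.5)

# Toy usage: minimize the weakly convex f(X) = ||A X||_1 over St(50, 3),
# using A^T sign(A X) as a Euclidean subgradient of f at X.
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 50))
X, _ = np.linalg.qr(rng.standard_normal((50, 3)))
for k in range(500):
    G = A.T @ np.sign(A @ X)
    X = stiefel_subgradient_step(X, G, step_size=0.1 / np.sqrt(k + 1))
```

The incremental and stochastic variants studied in the paper follow the same projection-then-retraction pattern, but replace the full Euclidean subgradient with one computed from a single component or a random sample of the objective.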
