The Proxy Step-size Technique for Regularized Optimization on the Sphere Manifold

09/05/2022
by Fang Bai, et al.

We give an effective solution to the regularized optimization problem g(x) + h(x), where x is constrained to the unit sphere ‖x‖_2 = 1. Here g(·) is a smooth cost with a Lipschitz continuous gradient within the unit ball {x : ‖x‖_2 ≤ 1}, whereas h(·) is typically non-smooth but convex and absolutely homogeneous, e.g., norm regularizers and their combinations. Our solution is based on the Riemannian proximal gradient, using an idea we call the proxy step-size: a scalar variable which we prove is monotone with respect to the actual step-size within an interval. The proxy step-size exists ubiquitously for convex and absolutely homogeneous h(·), and it determines the actual step-size and the tangent update in closed form, hence the complete proximal gradient iteration. Based on these insights, we design a Riemannian proximal gradient method using the proxy step-size. We prove that our method converges to a critical point, guided by a line-search technique based on the cost g(·) only. The proposed method can be implemented in a couple of lines of code. We show its usefulness by applying nuclear-norm, ℓ_1-norm, and nuclear-spectral-norm regularization to three classical computer vision problems; the improvements are consistent and backed by numerical experiments.
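To make the setting concrete, here is a minimal Python sketch of a generic Riemannian proximal-gradient loop on the unit sphere for an ℓ_1-regularized cost, with soft-thresholding as the Euclidean prox, a normalization retraction, and backtracking driven by g(·) only. It is an illustration under stated assumptions, not the paper's method: the function names (prox_grad_sphere, soft_threshold) and the sparse leading-eigenvector example are made up for this sketch, and the closed-form proxy step-size that replaces this generic backtracking of the prox step is what the paper actually contributes.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1: elementwise soft-thresholding.
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def prox_grad_sphere(g, grad_g, x0, lam=0.1, step0=1.0, max_iter=500, tol=1e-8):
    # Generic Riemannian proximal-gradient loop on the unit sphere for
    # g(x) + lam*||x||_1: tangent-space gradient, Euclidean prox step,
    # normalization retraction, and backtracking on the smooth cost g(.) only.
    # Illustrative sketch; it does NOT implement the closed-form proxy step-size.
    x = x0 / np.linalg.norm(x0)
    for _ in range(max_iter):
        eg = grad_g(x)
        rg = eg - (x @ eg) * x              # Riemannian gradient (tangent projection)
        t, x_new = step0, x
        for _ in range(50):                 # backtracking line search on g(.)
            y = soft_threshold(x - t * rg, t * lam)
            if np.linalg.norm(y) > 1e-12:
                cand = y / np.linalg.norm(y)    # retract back onto the sphere
                if g(cand) <= g(x) - 1e-4 * t * (rg @ rg):
                    x_new = cand
                    break
            t *= 0.5
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Toy usage (hypothetical example): a sparse leading eigenvector of a random
# symmetric matrix A, i.e. minimize -x'Ax + lam*||x||_1 subject to ||x||_2 = 1.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 20)); A = (A + A.T) / 2.0
g = lambda x: -x @ A @ x
grad_g = lambda x: -2.0 * A @ x
x_hat = prox_grad_sphere(g, grad_g, rng.standard_normal(20), lam=0.2)
print(np.round(x_hat, 3))
```

The sketch mirrors the structure described in the abstract (line search on g(·) only, prox step followed by a return to the sphere); in the paper, the proxy step-size removes the need to search over the prox step itself by giving the step-size and tangent update in closed form.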
