Fast stochastic dual coordinate descent algorithms for linearly constrained convex optimization

07/31/2023
by   Yun Zeng, et al.

The problem of finding a solution to the linear system Ax = b with certain minimization properties arises in numerous scientific and engineering areas. In the era of big data, stochastic optimization algorithms have become increasingly important because they scale to problems of unprecedented size. This paper focuses on minimizing a strongly convex function subject to linear constraints. We consider the dual formulation of this problem and apply stochastic coordinate descent to solve it. The proposed algorithmic framework, called fast stochastic dual coordinate descent, uses sampling matrices drawn from user-defined distributions to extract gradient information. Moreover, it employs Polyak's heavy ball momentum with adaptive parameters learned through the iterations, overcoming a limitation of the classical heavy ball method, namely that it requires prior knowledge of certain quantities, such as the singular values of a matrix. With these extensions, the framework recovers many well-known methods as special cases, including the randomized sparse Kaczmarz method, the randomized regularized Kaczmarz method, the linearized Bregman iteration, and a variant of the conjugate gradient (CG) method. We prove that, for a strongly admissible objective function, the proposed method converges linearly in expectation. Numerical experiments confirm our theoretical results.
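
To make the dual update concrete, below is a minimal sketch of one special case of such a framework: for f(x) = (1/2)||x||^2, dual coordinate descent with single-row sampling reduces to the randomized Kaczmarz iteration (the primal iterate x = A^T y tracks the dual iterate y), and the extra term is Polyak's heavy ball momentum. The fixed momentum parameter beta and the helper name kaczmarz_heavy_ball are illustrative assumptions; the paper's framework instead adapts the momentum parameters across iterations, which is not reproduced here.

```python
import numpy as np

def kaczmarz_heavy_ball(A, b, alpha=1.0, beta=0.4, iters=5000, seed=0):
    """Randomized Kaczmarz with a fixed heavy-ball momentum term.

    For a consistent system Ax = b, the iterates approach the
    minimum-norm solution of min (1/2)||x||^2 s.t. Ax = b.
    beta is a fixed user choice here, not an adaptive schedule.
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    row_norms_sq = np.einsum("ij,ij->i", A, A)
    probs = row_norms_sq / row_norms_sq.sum()   # sample row i with prob ~ ||a_i||^2
    x_prev = np.zeros(n)
    x = np.zeros(n)
    for _ in range(iters):
        i = rng.choice(m, p=probs)              # pick one linear constraint
        residual = b[i] - A[i] @ x              # violation of constraint i
        x_new = (x
                 + alpha * (residual / row_norms_sq[i]) * A[i]  # Kaczmarz step
                 + beta * (x - x_prev))                         # heavy-ball momentum
        x_prev, x = x, x_new
    return x

# Small consistent system: the final residual ||Ax - b|| should be near zero.
rng = np.random.default_rng(1)
A = rng.standard_normal((100, 20))
b = A @ rng.standard_normal(20)
x_hat = kaczmarz_heavy_ball(A, b)
print(np.linalg.norm(A @ x_hat - b))
```

Replacing the quadratic objective with f(x) = lambda*||x||_1 + (1/2)||x||^2 would, per the abstract, yield the randomized sparse Kaczmarz method; the sketch above only covers the plain quadratic case.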

research · 10/30/2017
Linearly convergent stochastic heavy ball method for minimizing generalization error
In this work we establish the first linear convergence result for the st...

research · 12/27/2017
Momentum and Stochastic Momentum for Stochastic Gradient, Newton, Proximal Point and Subspace Descent Methods
In this paper we study several classes of stochastic optimization algori...

research · 03/11/2020
Stochastic Coordinate Minimization with Progressive Precision for Stochastic Convex Optimization
A framework based on iterative coordinate minimization (CM) is developed...

research · 03/16/2023
Variational Principles for Mirror Descent and Mirror Langevin Dynamics
Mirror descent, introduced by Nemirovski and Yudin in the 1970s, is a pr...

research · 09/23/2018
Accelerated Gossip via Stochastic Heavy Ball Method
In this paper we show how the stochastic heavy ball method (SHB) -- a po...

research · 06/08/2015
Linear Convergence of the Randomized Feasible Descent Method Under the Weak Strong Convexity Assumption
In this paper we generalize the framework of the feasible descent method...

research · 09/01/2022
Optimal Regularized Online Convex Allocation by Adaptive Re-Solving
This paper introduces a dual-based algorithm framework for solving the r...
