An Accelerated Variance-Reduced Conditional Gradient Sliding Algorithm for First-order and Zeroth-order Optimization

09/18/2021
by   Xiyuan Wei, et al.

The conditional gradient algorithm (also known as the Frank-Wolfe algorithm) has recently regained popularity in the machine learning community due to its projection-free property for solving constrained problems. Although many variants of the conditional gradient algorithm have been proposed to improve performance, they depend on first-order information (gradients) to optimize. Naturally, these algorithms cannot function in the increasingly popular field of zeroth-order optimization, where only zeroth-order information (function values) is available. To fill this gap, we propose a novel Accelerated variance-Reduced Conditional gradient Sliding (ARCS) algorithm for finite-sum problems, which can use either first-order or zeroth-order information to optimize. To the best of our knowledge, ARCS is the first zeroth-order conditional gradient sliding type algorithm for solving convex problems in zeroth-order optimization. In first-order optimization, the convergence results of ARCS substantially outperform those of previous algorithms in terms of the number of gradient oracle queries. Finally, we validate the superiority of ARCS through experiments on real-world datasets.
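To make the two ingredients of the abstract concrete, the following is a minimal sketch (not the ARCS algorithm itself, which is given in the paper) of a projection-free conditional gradient loop over the probability simplex, run once with an exact first-order gradient and once with a generic two-point zeroth-order gradient estimator that queries only function values. All function and parameter names here are illustrative assumptions, not the paper's notation.

```python
import numpy as np

_rng = np.random.default_rng(0)

def zo_grad(f, x, mu=1e-6, samples=20):
    """Two-point zeroth-order gradient estimator: averages
    (f(x + mu*u) - f(x - mu*u)) / (2*mu) * u over random directions u.
    Only function values of f are queried, never its gradient."""
    g = np.zeros_like(x)
    for _ in range(samples):
        u = _rng.standard_normal(x.size)
        g += (f(x + mu * u) - f(x - mu * u)) / (2 * mu) * u
    return g / samples

def frank_wolfe(f, grad, x0, iters=300):
    """Projection-free conditional gradient over the probability simplex.
    The linear minimization oracle over the simplex just returns the
    vertex e_i whose coordinate minimizes the (estimated) gradient."""
    x = x0.copy()
    for t in range(iters):
        g = grad(x)
        s = np.zeros_like(x)
        s[np.argmin(g)] = 1.0        # LMO: best vertex of the simplex
        gamma = 2.0 / (t + 2)        # classical Frank-Wolfe step size
        x = (1 - gamma) * x + gamma * s
    return x

# Toy objective f(x) = ||x - c||^2 with c inside the simplex,
# so the constrained minimizer is c itself.
c = np.array([0.2, 0.5, 0.3])
f = lambda x: float(np.sum((x - c) ** 2))
x0 = np.array([1.0, 0.0, 0.0])

x_fo = frank_wolfe(f, lambda x: 2 * (x - c), x0)   # first-order oracle
x_zo = frank_wolfe(f, lambda x: zo_grad(f, x), x0) # zeroth-order oracle
```

Because every iterate is a convex combination of simplex vertices, the loop stays feasible without ever computing a projection; swapping the exact gradient for the finite-difference estimator is the only change needed to move from the first-order to the zeroth-order setting.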


