Stochastic Nonsmooth Convex Optimization with Heavy-Tailed Noises

by Zijian Liu et al.

Recently, several studies have considered the stochastic optimization problem in a heavy-tailed noise regime, i.e., the difference between the stochastic gradient and the true gradient is assumed to have a finite p-th moment (say, upper bounded by σ^p for some σ≥0) for p∈(1,2]. This assumption not only generalizes the traditional finite-variance assumption (p=2) but has also been observed in practice across several different tasks. Under this challenging assumption, much new progress has been made for both convex and nonconvex problems; however, most of it considers only smooth objectives. In contrast, the nonsmooth case has not been fully explored or well understood. This paper aims to fill this crucial gap by providing a comprehensive analysis of stochastic nonsmooth convex optimization with heavy-tailed noises. We revisit a simple clipping-based algorithm which, so far, has only been proved to converge in expectation, and only under an additional strong convexity assumption. Under appropriate choices of parameters, for both convex and strongly convex functions, we not only establish the first high-probability rates but also give refined in-expectation bounds compared with existing works. Remarkably, all of our results are optimal (or nearly optimal up to logarithmic factors) with respect to the time horizon T, even when T is unknown in advance. Additionally, we show how to make the algorithm parameter-free with respect to σ; in other words, the algorithm can still guarantee convergence without any prior knowledge of σ.
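The clipping-based method the abstract revisits is, in spirit, stochastic subgradient descent in which each stochastic subgradient is rescaled to a bounded norm before the update, which tames rare but enormous noise realizations. A minimal sketch of this idea (the function names, step size, clipping radius, and toy noise model below are illustrative assumptions, not the paper's actual parameter choices):

```python
import numpy as np

def clipped_subgradient_method(subgrad, x0, steps, lr, clip):
    """Stochastic subgradient descent with norm clipping.

    subgrad(x): returns a stochastic subgradient whose noise may be
    heavy-tailed (finite p-th moment for some p in (1, 2], possibly
    infinite variance). Each raw subgradient is rescaled so that its
    Euclidean norm is at most `clip` before taking the step.
    Returns the running average of the iterates.
    """
    x = np.asarray(x0, dtype=float)
    avg = x.copy()
    for t in range(1, steps + 1):
        g = np.asarray(subgrad(x), dtype=float)
        norm = np.linalg.norm(g)
        if norm > clip:
            g *= clip / norm          # project g onto the ball of radius `clip`
        x = x - lr * g
        avg += (x - avg) / (t + 1)    # online average of x_1, ..., x_t
    return avg

# Toy use: minimize f(x) = |x| with symmetric heavy-tailed noise on the
# subgradient (Student's t with 1.5 degrees of freedom: infinite variance,
# but finite p-th moment for any p < 1.5).
rng = np.random.default_rng(0)
noisy_subgrad = lambda x: np.sign(x) + rng.standard_t(1.5, size=x.shape)
```

Without the clipping step, a single extreme noise draw can throw the iterate arbitrarily far; with it, every update moves at most `lr * clip`, which is what makes high-probability guarantees possible under only a finite p-th moment.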




Breaking the Lower Bound with (Little) Structure: Acceleration in Non-Convex Stochastic Optimization with Heavy-Tailed Noise

We consider the stochastic optimization problem with smooth but not nece...

Near-Optimal High Probability Complexity Bounds for Non-Smooth Stochastic Optimization with Heavy-Tailed Noise

Thanks to their practical efficiency and random nature of the data, stoc...

Nearly Optimal Robust Method for Convex Compositional Problems with Heavy-Tailed Noise

In this paper, we propose robust stochastic algorithms for solving conve...

Efficient Private SCO for Heavy-Tailed Data via Clipping

We consider stochastic convex optimization for heavy-tailed data with th...

High Probability Bounds for Stochastic Subgradient Schemes with Heavy Tailed Noise

In this work we study high probability bounds for stochastic subgradient...

Taming Fat-Tailed ("Heavier-Tailed" with Potentially Infinite Variance) Noise in Federated Learning

A key assumption in most existing works on FL algorithms' convergence an...

Robust learning with anytime-guaranteed feedback

Under data distributions which may be heavy-tailed, many stochastic grad...
