Inspired by a recent breakthrough of Mishchenko et al. (2022), who for th...
One of the first steps during the investigation of geological objects is...
We consider the task of minimizing the sum of smooth and strongly convex...
We propose ADOM - an accelerated method for smooth and strongly convex d...
We propose a family of lossy integer compressions for Stochastic Gradien...
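For background, one generic instance of a lossy integer compression is unbiased stochastic rounding onto a scaled integer grid. The sketch below is my own illustration of that standard idea, not the specific family proposed in the abstract; the `levels` parameter and function names are assumptions:

```python
import numpy as np

def quantize(v, levels=127, rng=None):
    """Unbiased stochastic rounding of v onto an integer grid of +/- levels."""
    rng = np.random.default_rng(0) if rng is None else rng
    scale = np.max(np.abs(v)) / levels
    if scale == 0.0:
        return np.zeros(v.shape, dtype=np.int32), 0.0
    t = v / scale                  # real-valued grid coordinates in [-levels, levels]
    low = np.floor(t)
    p = t - low                    # round up with probability p, so E[q] = t
    q = (low + (rng.random(v.shape) < p)).astype(np.int32)
    return q, scale

def dequantize(q, scale):
    return q.astype(np.float64) * scale
```

The estimator is unbiased, and the reconstruction error is bounded by one grid step (`scale`) per coordinate, which is the property variance analyses of compressed SGD typically rely on.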
We consider distributed convex-concave saddle point problems over arbitr...
Decentralized optimization methods enable on-device training of machine ...
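The core communication primitive behind most decentralized methods is gossip averaging: each node repeatedly mixes its local vector with its neighbors' values through a doubly stochastic matrix, driving all nodes toward the network-wide average. A minimal sketch on a 5-node ring (illustrative only; the topology and weights are my assumptions, not any paper's method):

```python
import numpy as np

n = 5
# Doubly stochastic mixing matrix for a ring: weight 1/3 to self and each neighbor.
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = W[i, (i - 1) % n] = W[i, (i + 1) % n] = 1.0 / 3.0

x = np.array([0.0, 1.0, 2.0, 3.0, 14.0])   # local values, one per node
target = x.mean()                          # gossip preserves the average
for _ in range(500):                       # repeated local mixing rounds
    x = W @ x                              # each node averages with its neighbors
```

Because `W` is doubly stochastic, the average is preserved at every round, and the second-largest eigenvalue modulus of `W` governs how fast all nodes reach consensus.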
In this paper, we propose a unified analysis of variants of distributed ...
We consider the task of decentralized minimization of the sum of smooth ...
Most algorithms for solving optimization problems or finding saddle poin...
Due to the high communication cost in distributed and federated learning...
Since the late 1950s, when quasi-Newton methods first appeared, they hav...
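For context, the canonical quasi-Newton scheme is BFGS, which builds an inverse-Hessian approximation from successive gradient differences. A textbook sketch with Armijo backtracking follows (a generic illustration, not any method from the abstract):

```python
import numpy as np

def bfgs(f, grad, x0, iters=100, tol=1e-8):
    """Textbook BFGS with backtracking (Armijo) line search."""
    x = np.asarray(x0, dtype=float).copy()
    H = np.eye(len(x))                     # inverse-Hessian approximation
    g = grad(x)
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g                         # quasi-Newton search direction
        t = 1.0
        while f(x + t * p) > f(x) + 1e-4 * t * (g @ p):
            t *= 0.5                       # backtrack until sufficient decrease
        s = t * p
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        if y @ s > 1e-12:                  # curvature condition keeps H positive definite
            rho = 1.0 / (y @ s)
            I = np.eye(len(x))
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

# Usage on a small strongly convex quadratic f(x) = 0.5 x'Ax - b'x:
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_min = bfgs(lambda x: 0.5 * x @ A @ x - b @ x, lambda x: A @ x - b, np.zeros(2))
```

The rank-two update is what gives quasi-Newton methods curvature information at the cost of only gradient evaluations.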
We propose an accelerated version of stochastic variance reduced coordin...
We propose basic and natural assumptions under which iterative optimizat...
We present two new remarkably simple stochastic second-order methods for...
We propose a new algorithm---Stochastic Proximal Langevin Algorithm (SPL...
We consider a new extension of the extragradient method that is motivate...
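For reference, the classical extragradient method that such extensions build on takes the gradient step from an extrapolated ("look-ahead") point. On the bilinear saddle problem min_x max_y xy, where plain gradient descent-ascent spirals outward, extragradient converges to the solution (0, 0). A minimal sketch of the classical method under these assumptions (step size chosen arbitrarily):

```python
# Extragradient on f(x, y) = x * y:  grad_x f = y,  grad_y f = x.
gamma = 0.1
x, y = 1.0, 1.0
for _ in range(1000):
    xh = x - gamma * y      # extrapolation (half) step
    yh = y + gamma * x
    # Actual update uses the gradients evaluated at the half-step point.
    x, y = x - gamma * yh, y + gamma * xh
```

The extrapolation is what damps the rotation of the bilinear vector field; without it the same step size diverges.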
The stochastic variance-reduced gradient method (SVRG) and its accelerat...
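As background, the SVRG estimator corrects a stochastic gradient at the current iterate using a full gradient stored at a periodic snapshot, so the variance of the step vanishes as the iterate approaches the optimum. A minimal sketch on a noiseless least-squares problem (illustrative only; the step size, epoch length, and problem sizes are arbitrary choices of mine):

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 50, 5
A = rng.normal(size=(n, d))
w_star = rng.normal(size=d)
b = A @ w_star                       # noiseless targets: the optimum is w_star

def full_grad(w):                    # gradient of (1/2n) * ||A w - b||^2
    return A.T @ (A @ w - b) / n

w = np.zeros(d)
lr = 0.01
for _ in range(50):                  # outer loop: refresh the snapshot
    w_snap = w.copy()
    mu = full_grad(w_snap)           # full gradient at the snapshot
    for _ in range(n):               # inner loop: variance-reduced steps
        i = rng.integers(n)
        gi = A[i] * (A[i] @ w - b[i])             # stochastic gradient at w
        gi_snap = A[i] * (A[i] @ w_snap - b[i])   # same sample at the snapshot
        w -= lr * (gi - gi_snap + mu)             # SVRG estimator: unbiased, low variance
```

The correction term `gi - gi_snap + mu` has the same expectation as `gi` but far smaller variance near the snapshot, which is what restores a linear convergence rate for strongly convex sums.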