Accelerated Randomized Coordinate Descent Methods for Stochastic Optimization and Online Learning

06/05/2018
by   Akshita Bhandari, et al.

We propose accelerated randomized coordinate descent algorithms for stochastic optimization and online learning. Our algorithms have significantly lower per-iteration complexity than known accelerated gradient algorithms. The proposed algorithms for online learning achieve better regret bounds than known randomized online coordinate descent algorithms, while the proposed algorithms for stochastic optimization match the convergence rates of the best known randomized coordinate descent algorithms. We also present simulation results demonstrating the performance of the proposed algorithms.
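The low per-iteration cost comes from updating a single randomly chosen coordinate per step instead of computing a full gradient. The sketch below illustrates plain (non-accelerated) randomized coordinate descent on a quadratic objective; it is a generic illustration of the technique, not the authors' accelerated algorithm, and all names in it are placeholders.

```python
import numpy as np

def randomized_coordinate_descent(A, b, n_iters=2000, seed=0):
    """Minimize f(x) = 0.5 x^T A x - b^T x, updating one random coordinate per step.

    Generic illustration only; the paper's accelerated variants add momentum
    terms on top of this basic coordinate update.
    """
    rng = np.random.default_rng(seed)
    n = len(b)
    x = np.zeros(n)
    L = np.diag(A)  # per-coordinate Lipschitz constants for this quadratic
    for _ in range(n_iters):
        i = rng.integers(n)            # sample a coordinate uniformly at random
        g_i = A[i] @ x - b[i]          # partial derivative along coordinate i
        x[i] -= g_i / L[i]             # coordinate-wise gradient step
    return x

# Usage on a small positive-definite quadratic
rng = np.random.default_rng(1)
M = rng.standard_normal((5, 5))
A = M @ M.T + 5 * np.eye(5)
b = rng.standard_normal(5)
x_cd = randomized_coordinate_descent(A, b)
x_star = np.linalg.solve(A, b)         # closed-form minimizer for comparison
```

Each iteration touches one row of `A`, so its cost is O(n) rather than the O(n^2) of a full gradient step on this problem; the paper's contribution is combining this cheap update with Nesterov-style acceleration.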
