Stochastic Conjugate Gradient Algorithm with Variance Reduction

10/27/2017
by Xiao-Bo Jin et al.

Conjugate gradient methods are an important class of methods for solving linear systems of equations and nonlinear optimization problems. In our work, we propose a new stochastic conjugate gradient algorithm with variance reduction (CGVR) and prove its linear convergence with the Fletcher-Reeves update for strongly convex and smooth functions. We experimentally demonstrate that CGVR converges faster than its counterparts on six large-scale optimization problems that may be convex, non-convex, or non-smooth, and that its AUC (area under the ROC curve) performance with L2-regularized L2-loss is comparable to that of LIBLINEAR while offering a significant improvement in computational efficiency.
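To make the idea concrete, below is a minimal NumPy sketch of the general scheme the abstract describes: an SVRG-style variance-reduced stochastic gradient combined with a Fletcher-Reeves conjugate direction update. This is an illustrative assumption, not the paper's reference implementation; the function name cgvr, the fixed step size, and the toy least-squares problem are all hypothetical choices, and the paper's exact line search and parameter settings are not reproduced here.

```python
import numpy as np

def cgvr(grad_i, n, w0, n_epochs=20, inner=100, alpha=0.02, seed=0):
    """Sketch of stochastic conjugate gradient with variance reduction:
    an SVRG-style corrected gradient plus a Fletcher-Reeves direction.

    grad_i(w, i) -- gradient of the i-th component function at w
    n            -- number of component functions
    """
    rng = np.random.default_rng(seed)
    w = w0.copy()
    for _ in range(n_epochs):
        w_snap = w.copy()
        # Full gradient at the snapshot: the variance-reduction anchor.
        mu = np.mean([grad_i(w_snap, i) for i in range(n)], axis=0)
        g_prev, d = None, None
        for _ in range(inner):
            i = rng.integers(n)
            # Variance-reduced stochastic gradient (SVRG correction).
            g = grad_i(w, i) - grad_i(w_snap, i) + mu
            if d is None:
                d = -g
            else:
                # Fletcher-Reeves coefficient: ||g_k||^2 / ||g_{k-1}||^2.
                beta = (g @ g) / (g_prev @ g_prev + 1e-12)
                d = -g + beta * d
            g_prev = g
            w = w + alpha * d  # fixed step stands in for a line search
    return w

# Toy usage: least squares, f(w) = (1/2n) * sum_i (x_i^T w - y_i)^2.
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 5))
w_true = rng.standard_normal(5)
y = X @ w_true
grad_i = lambda w, i: (X[i] @ w - y[i]) * X[i]
w_hat = cgvr(grad_i, n=200, w0=np.zeros(5))
print(np.linalg.norm(w_hat - w_true))  # should shrink toward zero
```

Note that the Fletcher-Reeves coefficient reuses only the norms of gradients that are already computed, so each inner step still costs just two component-gradient evaluations, the same per-iteration price as SVRG.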
