Weighted SGD for ℓ_p Regression with Randomized Preconditioning
In recent years, stochastic gradient descent (SGD) methods and randomized linear algebra (RLA) algorithms have been applied to many large-scale problems in machine learning and data analysis. We aim to bridge the gap between these two methods for solving constrained overdetermined linear regression problems, e.g., ℓ_2 and ℓ_1 regression problems. We propose a hybrid algorithm named pwSGD that uses RLA techniques for preconditioning and for constructing an importance sampling distribution, and then performs an SGD-like iterative process with weighted sampling on the preconditioned system. We prove that pwSGD inherits faster convergence rates that depend only on the lower dimension of the linear system, while maintaining low computational complexity. In particular, when solving an ℓ_1 regression problem of size n by d, pwSGD returns an approximate solution with ϵ relative error in the objective value in O(log n · nnz(A) + poly(d)/ϵ^2) time. This complexity is uniformly better than that of RLA methods in terms of both ϵ and d when the problem is unconstrained. For ℓ_2 regression, pwSGD returns an approximate solution with ϵ relative error in the objective value and in the solution vector measured in prediction norm in O(log n · nnz(A) + poly(d) log(1/ϵ)/ϵ) time. We also provide lower bounds on the coreset complexity for more general regression problems, indicating that new ideas will be needed to extend similar RLA preconditioning ideas to weighted SGD algorithms for more general regression problems. Finally, the effectiveness of such algorithms is illustrated numerically on both synthetic and real datasets.
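To make the two-stage structure concrete, the sketch below illustrates a pwSGD-style solver for unconstrained ℓ_2 regression following only the high-level description in the abstract: an RLA preconditioning step (here a plain Gaussian sketch stands in for a faster sparse embedding), an importance sampling distribution built from row norms of the preconditioned matrix (a standard RLA choice; the exact distribution in the paper may differ), and a weighted SGD loop. The function name pwsgd_l2, the step-size schedule, and all parameter defaults are illustrative assumptions, not the paper's specification.

```python
import numpy as np

def pwsgd_l2(A, b, num_iters=1000, sketch_size=None, eta=1.0, seed=0):
    """Illustrative pwSGD-style solver for min_x ||Ax - b||_2 (a sketch,
    not the paper's exact algorithm)."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    s = sketch_size or 4 * d

    # 1. RLA preconditioning: Gaussian sketch (stand-in for a faster
    #    sparse embedding), then QR of the sketched matrix.
    S = rng.standard_normal((s, n)) / np.sqrt(s)
    _, R = np.linalg.qr(S @ A)
    U = A @ np.linalg.inv(R)  # preconditioned matrix, nearly orthonormal columns

    # 2. Importance sampling distribution from squared row norms of U.
    row_norms = np.einsum('ij,ij->i', U, U)
    p = row_norms / row_norms.sum()

    # 3. Weighted SGD in the preconditioned coordinates y = R x.
    y = np.zeros(d)
    for t in range(1, num_iters + 1):
        i = rng.choice(n, p=p)
        resid = U[i] @ y - b[i]
        # Reweighting by 1/(n * p_i) makes g an unbiased estimate of
        # the averaged residual gradient (1/n) * sum_i resid_i * u_i.
        g = (resid / (n * p[i])) * U[i]
        y -= (eta / np.sqrt(t)) * g  # assumed 1/sqrt(t) step-size schedule
    return np.linalg.solve(R, y)    # map back: x = R^{-1} y

# Usage: compare against the exact least-squares solution on a random instance.
A = np.random.default_rng(1).standard_normal((2000, 10))
b = np.random.default_rng(2).standard_normal(2000)
x_hat = pwsgd_l2(A, b, num_iters=5000)
x_star, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.linalg.norm(A @ x_hat - b), np.linalg.norm(A @ x_star - b))
```

Because SGD runs on the preconditioned system A R^{-1} rather than on A itself, the convergence rate depends on the (well-conditioned) preconditioned matrix rather than on the conditioning of A, which is the source of the dimension-dependent rates claimed above.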