On the Randomized Complexity of Minimizing a Convex Quadratic Function

07/24/2018
by Max Simchowitz, et al.

Minimizing a convex, quadratic objective is a fundamental problem in machine learning and optimization. In this work, we prove information-theoretic gradient-query complexity lower bounds for minimizing convex quadratic functions which, unlike prior works, apply even to randomized algorithms. Specifically, we construct a distribution over quadratic functions that witnesses lower bounds matching those known for deterministic algorithms, up to multiplicative constants. The distribution witnessing our lower bound is in fact quite benign: it is both closed-form and derived from classical ensembles in random matrix theory. We believe that our construction constitutes a plausible "average case" setting, and thus provides compelling evidence that the worst-case and average-case complexity of convex-quadratic optimization are essentially identical.
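To make the setting concrete, here is a minimal sketch of the gradient-query model for a convex quadratic f(x) = ½ xᵀAx − bᵀx, with A drawn from a Wishart-style random-matrix ensemble. This is an illustrative assumption on our part: the abstract only says the hard distribution comes from "classical ensembles in random matrix theory", so the ensemble, function names, and step size below are hypothetical, not the paper's actual construction.

```python
import numpy as np

def sample_quadratic(d, rng):
    # Illustrative Wishart-style ensemble (hypothetical; the paper's
    # hard distribution differs in its details).
    G = rng.standard_normal((d, d))
    A = (G @ G.T) / d              # positive semidefinite, so f is convex
    b = rng.standard_normal(d)
    return A, b

def gradient_descent(A, b, n_queries):
    # Each iteration issues exactly one gradient query: grad f(x) = A x - b.
    # Lower bounds in this model count how many such queries any
    # (possibly randomized) algorithm needs to reach a given accuracy.
    step = 1.0 / np.linalg.norm(A, 2)  # 1/L step, L = largest eigenvalue of A
    x = np.zeros_like(b)
    for _ in range(n_queries):
        x -= step * (A @ x - b)
    return x

rng = np.random.default_rng(0)
A, b = sample_quadratic(d=100, rng=rng)
x_hat = gradient_descent(A, b, n_queries=500)
print(0.5 * x_hat @ A @ x_hat - b @ x_hat)  # objective value after 500 queries
```

Gradient descent is used here only as a familiar baseline algorithm in the query model; the paper's lower bounds constrain every algorithm that interacts with f solely through such gradient queries.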
