Optimal and Adaptive Monteiro-Svaiter Acceleration

by Yair Carmon et al.

We develop a variant of the Monteiro-Svaiter (MS) acceleration framework that removes the need to solve an expensive implicit equation at every iteration. Consequently, for any p ≥ 2 we improve the complexity of convex optimization with Lipschitz pth derivative by a logarithmic factor, matching a lower bound. We also introduce an MS subproblem solver that requires no knowledge of problem parameters, and implement it as either a second- or first-order method by solving linear systems or applying MinRes, respectively. On logistic regression our method outperforms previous second-order momentum methods, but underperforms Newton's method; simply iterating our first-order adaptive subproblem solver performs comparably to L-BFGS.
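To make the first-order (Hessian-free) implementation concrete, the sketch below shows the core mechanic the abstract alludes to: a regularized Newton-type subproblem on logistic regression, solved with MinRes using only Hessian-vector products. This is an illustrative approximation, not the paper's algorithm: the l2-regularized objective, the fixed regularization `lam` (which the paper sets adaptively), and all helper names are assumptions, and the MS momentum sequence is omitted.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, minres
from scipy.special import expit  # numerically stable sigmoid

# Illustrative sketch (not the paper's algorithm): each subproblem reduces
# to a regularized linear system (H(w) + lam*I) d = -grad(w), which a
# Hessian-free implementation solves with MinRes via Hessian-vector products.

rng = np.random.default_rng(0)
n, d = 200, 20
X = rng.standard_normal((n, d))
y = (X @ rng.standard_normal(d) > 0).astype(float)
mu = 1e-2   # l2 regularization (assumed, to make the objective strongly convex)
lam = 1e-3  # subproblem regularization (assumed fixed; adaptive in the paper)

def grad(w):
    # gradient of the l2-regularized logistic loss
    return X.T @ (expit(X @ w) - y) / n + mu * w

def hvp(w, v):
    # Hessian-vector product, without ever forming the Hessian
    s = expit(X @ w)
    return X.T @ ((s * (1 - s)) * (X @ v)) / n + mu * v

w = np.zeros(d)
for _ in range(30):
    g = grad(w)
    A = LinearOperator((d, d), matvec=lambda v: hvp(w, v) + lam * v,
                       dtype=np.float64)
    step, info = minres(A, -g)  # solve (H + lam*I) step = -g
    w = w + step

print(np.linalg.norm(grad(w)))  # gradient norm after 30 regularized Newton steps
```

Because MinRes only queries `hvp`, the per-iteration cost stays at a handful of matrix-vector products, which is what distinguishes the first-order implementation from the second-order one that solves the linear system directly.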



Generalized Optimistic Methods for Convex-Concave Saddle Point Problems

The optimistic gradient method has seen increasing popularity as an effi...

Second-Order Kernel Online Convex Optimization with Adaptive Sketching

Kernel online convex optimization (KOCO) is a framework combining the ex...

Distributed Second-order Convex Optimization

Convex optimization problems arise frequently in diverse machine learnin...

Super-Universal Regularized Newton Method

We analyze the performance of a variant of the Newton method with quadratic ...

First and zeroth-order implementations of the regularized Newton method with lazy approximated Hessians

In this work, we develop first-order (Hessian-free) and zero-order (deri...

A nonsmooth primal-dual method with simultaneous adaptive PDE constraint solver

We introduce an efficient first-order primal-dual method for the solutio...

Differentially Private Image Classification from Features

Leveraging transfer learning has recently been shown to be an effective ...
