Information-theoretic lower bounds on the oracle complexity of stochastic convex optimization

09/03/2010
by Alekh Agarwal, et al.

Relative to the large literature on upper bounds for the complexity of convex optimization, far less attention has been paid to the fundamental hardness of these problems. Given the extensive use of convex optimization in machine learning and statistics, understanding these complexity-theoretic issues is important. In this paper, we study the complexity of stochastic convex optimization in an oracle model of computation. We improve upon known results and obtain tight minimax complexity estimates for various function classes.
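To make the oracle model concrete, here is a minimal sketch of a stochastic first-order oracle and a gradient method that queries it. The quadratic objective, the Gaussian noise model, and the step-size schedule are illustrative assumptions, not the paper's construction; in this model, an algorithm's complexity is the number of oracle queries it issues.

```python
import numpy as np

# Illustrative sketch of the stochastic first-order oracle model.
# The objective f(x) = ||x - x_star||^2 / 2 and all names here are
# assumptions for demonstration, not the paper's hard instances.
# Each query at a point x returns unbiased, noisy estimates of
# (f(x), grad f(x)); complexity is counted in oracle queries.

rng = np.random.default_rng(0)
x_star = np.array([1.0, -2.0, 0.5])  # unknown minimizer
sigma = 0.1                          # oracle noise level

def noisy_first_order_oracle(x):
    """One oracle query: unbiased estimates of f(x) and grad f(x)."""
    grad = x - x_star                    # exact gradient of f
    value = 0.5 * np.dot(grad, grad)     # exact function value
    return (value + rng.normal(0.0, sigma),
            grad + rng.normal(0.0, sigma, size=x.shape))

def sgd(T, x0, step=lambda t: 1.0 / (t + 1)):
    """Stochastic gradient descent using exactly T oracle queries."""
    x = x0.copy()
    for t in range(T):
        _, g = noisy_first_order_oracle(x)  # one unit of oracle cost
        x -= step(t) * g
    return x

x_T = sgd(T=1000, x0=np.zeros(3))
print("error after 1000 oracle calls:", np.linalg.norm(x_T - x_star))
```

For settings like this one, stochastic gradient methods are known to achieve error decaying on the order of 1/sqrt(T) over convex Lipschitz functions (and 1/T under strong convexity) after T queries; lower bounds of the kind the paper establishes show that no algorithm, whatever its query strategy, can improve on such rates for the corresponding function classes.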


Related research

02/09/2023 · Information Theoretic Lower Bounds for Information Theoretic Upper Bounds
We examine the relationship between the mutual information between the o...

02/13/2019 · The Complexity of Making the Gradient Small in Stochastic Convex Optimization
We give nearly matching upper and lower bounds on the oracle complexity ...

07/12/2012 · Optimal rates for first-order stochastic convex optimization under Tsybakov noise condition
We focus on the problem of minimizing a convex function f over a convex ...

12/25/2013 · A Convex Formulation for Mixed Regression with Two Components: Minimax Optimal Rates
We consider the mixed regression problem with two components, under adve...

05/24/2016 · Local Minimax Complexity of Stochastic Convex Optimization
We extend the traditional worst-case, minimax analysis of stochastic con...

08/12/2018 · Parallelization does not Accelerate Convex Optimization: Adaptivity Lower Bounds for Non-smooth Convex Minimization
In this paper we study the limitations of parallelization in convex opti...

10/12/2021 · Complexity of optimizing over the integers
In the first part of this paper, we present a unified framework for anal...
