Optimization of Smooth Functions with Noisy Observations: Local Minimax Rates

03/22/2018
by Yining Wang, et al.

We consider the problem of global optimization of an unknown non-convex smooth function with zeroth-order feedback. In this setup, an algorithm is allowed to adaptively query the underlying function at different locations and receives noisy evaluations of function values at the queried points (i.e., the algorithm has access only to zeroth-order information). Optimization performance is measured by the expected difference between the function value at the estimated optimum and the value at the true optimum. In contrast to the classical optimization setup, first-order information such as gradients is not directly accessible to the algorithm. We show that the classical minimax framework of analysis, which roughly characterizes the worst-case query complexity of an optimization algorithm in this setting, leads to excessively pessimistic results. We propose a local minimax framework for studying the fundamental difficulty of optimizing smooth functions with adaptive function evaluations, which provides a refined picture of the intrinsic difficulty of zeroth-order optimization. We show that for functions whose level sets grow quickly around the global minimum, carefully designed optimization algorithms can identify a near-global minimizer with far fewer queries. For the special case of strongly convex and smooth functions, the implied convergence rates match those developed for zeroth-order convex optimization. At the other end of the spectrum, for worst-case smooth functions no algorithm can converge faster than the minimax rate of estimating the entire unknown function in the ℓ_∞ norm. We provide an intuitive and efficient algorithm that attains the derived upper error bounds.
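To make the zeroth-order feedback model concrete, the sketch below shows a naive baseline in which an optimizer sees only noisy function values: it averages repeated evaluations on a uniform grid over the unit cube and returns the grid point with the smallest averaged value. This is purely illustrative and is not the authors' algorithm; the function name, the unit-cube domain, and the grid-and-average strategy are assumptions made for the example.

```python
import numpy as np

def noisy_zeroth_order_minimize(oracle, dim, n_queries, grid_per_dim=10, rng=None):
    """Naive zeroth-order baseline (illustration only, not the paper's method):
    evaluate a noisy oracle on a uniform grid of [0, 1]^dim, average repeated
    evaluations at each grid point, and return the point with the smallest
    averaged value. `oracle(x)` is assumed to return f(x) plus zero-mean noise.
    """
    rng = np.random.default_rng() if rng is None else rng
    # Uniform grid over the unit cube.
    axes = [np.linspace(0.0, 1.0, grid_per_dim) for _ in range(dim)]
    grid = np.stack(np.meshgrid(*axes, indexing="ij"), axis=-1).reshape(-1, dim)
    # Split the query budget evenly across grid points and average out the noise.
    repeats = max(1, n_queries // len(grid))
    means = np.array([np.mean([oracle(x) for _ in range(repeats)]) for x in grid])
    return grid[np.argmin(means)]

if __name__ == "__main__":
    # Toy example: quadratic objective observed with additive Gaussian noise.
    rng = np.random.default_rng(0)
    f = lambda x: np.sum((x - 0.3) ** 2)
    oracle = lambda x: f(x) + 0.1 * rng.standard_normal()
    x_hat = noisy_zeroth_order_minimize(oracle, dim=2, n_queries=5000, rng=rng)
    print("estimated minimizer:", x_hat, "optimality gap:", f(x_hat))
```

The baseline spreads its budget uniformly, which is why its error scales like the ℓ_∞ estimation rate of the whole function; the abstract's point is that adaptive query allocation can do substantially better when level sets grow quickly around the global minimum.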


Related research

08/05/2021
Adapting to Function Difficulty and Growth Conditions in Private Optimization
We develop algorithms for private stochastic convex optimization that ad...

12/07/2013
Optimal rates for zero-order convex optimization: the power of two function evaluations
We consider derivative-free algorithms for stochastic and non-stochastic...

05/24/2016
Local Minimax Complexity of Stochastic Convex Optimization
We extend the traditional worst-case, minimax analysis of stochastic con...

10/01/2018
A simple parameter-free and adaptive approach to optimization under a minimal local smoothness assumption
We study the problem of optimizing a function under a budgeted number of...

03/07/2017
Global optimization of Lipschitz functions
The goal of the paper is to design sequential strategies which lead to e...

10/29/2017
Stochastic Zeroth-order Optimization in High Dimensions
We consider the problem of optimizing a high-dimensional convex function...

04/30/2005
A New Kind of Hopfield Networks for Finding Global Optimum
The Hopfield network has been applied to solve optimization problems ove...
