Finding Global Minima via Kernel Approximations

12/22/2020
by Alessandro Rudi, et al.

We consider the global minimization of smooth functions based solely on function evaluations. Algorithms that achieve the optimal number of function evaluations for a given precision level typically rely on explicitly constructing an approximation of the function, which is then minimized with algorithms that have exponential running-time complexity. In this paper, we consider an approach that jointly models the function and finds a global minimum. This is done by using infinite sums of squares of smooth functions and has strong links with polynomial sum-of-squares hierarchies. Leveraging recent representation properties of reproducing kernel Hilbert spaces, the infinite-dimensional optimization problem can be solved by subsampling in time polynomial in the number of function evaluations, with theoretical guarantees on the obtained minimum. Given n samples, the computational cost is O(n^3.5) in time and O(n^2) in space, and we achieve a convergence rate to the global optimum of O(n^(-m/d + 1/2 + 3/d)), where m is the degree of differentiability of the function and d the number of dimensions. The rate is nearly optimal in the case of Sobolev functions and, more generally, makes the proposed method particularly suitable for functions with a large number of derivatives. Indeed, when m is of the order of d, the convergence rate to the global optimum does not suffer from the curse of dimensionality, which affects only the worst-case constants (tracked explicitly throughout the paper).
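To make the formulation concrete, here is a minimal sketch of the kernel sum-of-squares relaxation on a one-dimensional toy problem, assuming numpy and cvxpy (with its bundled SCS solver) are available. It searches for the largest lower bound c such that f(x_i) - c equals a positive semidefinite quadratic form in kernel features at each sampled point, i.e. f(x_i) - c = <Phi(x_i), B Phi(x_i)> with B positive semidefinite (PSD), plus a small trace penalty as regularization. The test function, the Gaussian kernel, and all parameter values are illustrative stand-ins, not the paper's choices (the paper analyzes Sobolev kernels and attains its complexity with a dedicated Newton-type scheme rather than an off-the-shelf SDP solver).

```python
import numpy as np
import cvxpy as cp

# Illustrative 1-D test function (not from the paper): global minimum near x = -1.
def f(x):
    return (x**2 - 1.0)**2 + 0.3 * x

rng = np.random.default_rng(0)
n = 40
X = rng.uniform(-2.0, 2.0, size=n)   # n function evaluations
fX = f(X)

# Kernel matrix. The paper works with Sobolev kernels; a Gaussian kernel
# is used here purely as a stand-in.
sigma = 0.5
K = np.exp(-(X[:, None] - X[None, :]) ** 2 / (2 * sigma**2))
K += 1e-8 * np.eye(n)                # jitter so the Cholesky factor exists

# Cholesky factor R with K = R R^T: row R[i] acts as a finite-dimensional
# feature vector with <R[i], R[j]> = K[i, j], which is what makes the
# infinite-dimensional problem solvable after subsampling.
R = np.linalg.cholesky(K)

# Kernel sum-of-squares relaxation: find the largest c such that
# f(x_i) - c equals a PSD quadratic form in the features at every sample,
# with a small trace penalty lam as regularization.
B = cp.Variable((n, n), PSD=True)
c = cp.Variable()
lam = 1e-4
constraints = [fX[i] - c == R[i] @ B @ R[i] for i in range(n)]
prob = cp.Problem(cp.Maximize(c - lam * cp.trace(B)), constraints)
prob.solve(solver=cp.SCS)

print("kernel-SOS estimate of the global minimum:", c.value)
print("best sampled value of f:                  ", fX.min())
```

Since B PSD forces f(x_i) - c >= 0 at every sample, the optimal c is a certified lower bound on the sampled values; solving this n x n semidefinite program is the step behind the O(n^3.5) time and O(n^2) space costs quoted above.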

Related research

- Infinite-Variate L^2-Approximation with Nested Subspace Sampling (01/30/2023)
  We consider L^2-approximation on weighted reproducing kernel Hilbert spa...

- The theory and application of penalized methods or Reproducing Kernel Hilbert Spaces made easy (11/08/2011)
  The popular cubic smoothing spline estimate of a regression function ari...

- Fighting the curse of dimensionality: A machine learning approach to finding global optima (10/28/2021)
  Finding global optima in high-dimensional optimization problems is extre...

- Approximation of optimization problems with constraints through kernel Sum-Of-Squares (01/16/2023)
  Handling an infinite number of inequality constraints in infinite-dimens...

- Function values are enough for L_2-approximation: Part II (11/03/2020)
  In the first part we have shown that, for L_2-approximation of functions...

- On efficient algorithms for computing near-best polynomial approximations to high-dimensional, Hilbert-valued functions from limited samples (03/25/2022)
  Sparse polynomial approximation has become indispensable for approximati...

- Convergence rates for sums-of-squares hierarchies with correlative sparsity (03/26/2023)
  This work derives upper bounds on the convergence rate of the moment-sum...
