Convergence Rates for Non-Log-Concave Sampling and Log-Partition Estimation

03/06/2023
by David Holzmüller, et al.

Sampling from Gibbs distributions p(x) ∝ exp(-V(x)/ε) and computing their log-partition function are fundamental tasks in statistics, machine learning, and statistical physics. However, while efficient algorithms are known for convex potentials V, the situation is much more difficult in the non-convex case, where algorithms necessarily suffer from the curse of dimensionality in the worst case. For optimization, which can be seen as a low-temperature limit of sampling, it is known that smooth functions V allow faster convergence rates. Specifically, for m-times differentiable functions in d dimensions, the optimal rate for algorithms with n function evaluations is known to be O(n^{-m/d}), where the constant can potentially depend on m, d, and the function to be optimized. Hence, the curse of dimensionality can be alleviated for smooth functions, at least in terms of the convergence rate. Recently, it has been shown that similarly fast rates can also be achieved with polynomial runtime O(n^{3.5}), where the exponent 3.5 is independent of m and d. Hence, it is natural to ask whether similar rates for sampling and log-partition computation are possible, and whether they can be realized in polynomial time with an exponent independent of m and d. We show that the optimal rates for sampling and log-partition computation are sometimes equal to and sometimes faster than those for optimization. We then analyze various polynomial-time sampling algorithms, including an extension of a recent promising optimization approach, and find that they sometimes exhibit interesting behavior but no near-optimal rates. Our results also give further insights into the relation between sampling, log-partition, and optimization problems.
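To make the two tasks concrete, here is a minimal illustrative sketch (not the paper's algorithms): sampling from a Gibbs distribution p(x) ∝ exp(-V(x)/ε) with the unadjusted Langevin algorithm, and computing a reference log-partition value by quadrature, for a simple non-convex double-well potential V(x) = (x² - 1)² in one dimension. The potential, step size, and temperature ε are illustrative choices; quadrature is only feasible here because d = 1, which is exactly why the high-dimensional non-convex problem is hard.

```python
import numpy as np

def V(x):
    """Non-convex double-well potential (illustrative choice)."""
    return (x**2 - 1.0)**2

def grad_V(x):
    return 4.0 * x * (x**2 - 1.0)

def ula_sample(n_steps=20000, step=1e-3, eps=0.5, seed=0):
    """Unadjusted Langevin algorithm targeting p(x) ∝ exp(-V(x)/eps)."""
    rng = np.random.default_rng(seed)
    x = 0.0
    samples = np.empty(n_steps)
    for i in range(n_steps):
        # Langevin update: gradient step on V/eps plus Gaussian noise.
        x = x - step * grad_V(x) / eps + np.sqrt(2.0 * step) * rng.standard_normal()
        samples[i] = x
    return samples

def log_partition(eps=0.5, lo=-3.0, hi=3.0, n=100001):
    """Reference log Z = log ∫ exp(-V(x)/eps) dx via a Riemann sum (1D only)."""
    xs = np.linspace(lo, hi, n)
    dx = xs[1] - xs[0]
    return np.log(np.sum(np.exp(-V(xs) / eps)) * dx)

samples = ula_sample()
logZ = log_partition()
# After burn-in, samples concentrate near the two wells at x = ±1.
```

Note that ULA is only a baseline: its stationary distribution is biased by the discretization, and for non-convex V its mixing time can grow exponentially in the barrier height over ε, which is the regime the abstract's worst-case statements refer to.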


