Minimax Efficient Finite-Difference Stochastic Gradient Estimators Using Black-Box Function Evaluations

07/08/2020
by   Henry Lam, et al.

We consider stochastic gradient estimation using noisy black-box function evaluations. A standard approach is the finite-difference method or one of its variants. While natural, to our knowledge it has remained an open question whether its statistical accuracy is the best possible. This paper answers affirmatively by showing that the central finite-difference scheme is a nearly minimax optimal zeroth-order gradient estimator, both within the class of linear estimators and within the much larger class of all (nonlinear) estimators.
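To make the setting concrete, the sketch below shows a plain central finite-difference gradient estimator built only from noisy black-box evaluations, the baseline scheme the paper analyzes. The function names, perturbation size h, and averaging scheme here are illustrative assumptions, not the paper's specific construction or tuning.

```python
import numpy as np

def central_fd_gradient(f, x, h=1e-2, n_samples=100):
    """Estimate the gradient of a noisy black-box function f at x
    using central finite differences, averaging repeated noisy
    evaluations at each perturbed point. This is an illustrative
    sketch; the choice of h and sample allocation here are ad hoc."""
    d = len(x)
    grad = np.zeros(d)
    for i in range(d):
        e = np.zeros(d)
        e[i] = h
        # Average noisy evaluations at the two perturbed points
        # to reduce the variance of each function-value estimate.
        f_plus = np.mean([f(x + e) for _ in range(n_samples)])
        f_minus = np.mean([f(x - e) for _ in range(n_samples)])
        grad[i] = (f_plus - f_minus) / (2 * h)
    return grad

# Example: noisy evaluations of f(x) = ||x||^2, whose true gradient is 2x.
rng = np.random.default_rng(0)
noisy_f = lambda x: float(x @ x) + rng.normal(scale=0.01)
x0 = np.array([1.0, -0.5])
g = central_fd_gradient(noisy_f, x0)
```

Central differencing cancels the leading-order bias term of the one-sided scheme (bias O(h^2) rather than O(h)), which is why it serves as the natural candidate for the minimax-optimal rate discussed above.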
