Finite-Sample Symmetric Mean Estimation with Fisher Information Rate

06/28/2023
by Shivam Gupta, et al.

The mean of an unknown distribution f with variance σ^2 can be estimated from n samples with variance σ^2/n, and with a nearly matching subgaussian concentration rate. When f is known up to translation, this can be improved asymptotically to 1/(nℐ), where ℐ is the Fisher information of the distribution. Such an improvement is not possible for general unknown f, but [Stone, 1975] showed that this asymptotic convergence is possible if f is symmetric about its mean. Stone's bound is asymptotic, however: the n required for convergence depends in an unspecified way on the distribution f and the failure probability δ. In this paper we give finite-sample guarantees for symmetric mean estimation in terms of Fisher information. For every f, n, and δ with n > log(1/δ), we get convergence close to subgaussian with variance 1/(n ℐ_r), where ℐ_r is the r-smoothed Fisher information with smoothing radius r that decays polynomially in n. Such a bound essentially matches the finite-sample guarantees in the known-f setting.
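
To make the gap between the σ^2/n rate and the 1/(nℐ) rate concrete, here is a small simulation sketch (illustrative only, not the paper's estimator). It uses a Laplace distribution, which is symmetric about its mean: with scale b its variance is 2b^2 while its Fisher information is 1/b^2, so the location MLE (the sample median) asymptotically attains variance b^2/n, twice as good as the sample mean's 2b^2/n. The choice of distribution, sample size, and estimators below are assumptions made for illustration.

```python
import numpy as np

# Illustrative sketch (not from the paper): for a Laplace(0, b) distribution,
# variance sigma^2 = 2*b^2 but Fisher information of the location family is
# I = 1/b^2, so the Fisher-information rate 1/(n*I) = b^2/n beats the
# sample-mean rate sigma^2/n = 2*b^2/n. The sample median is the location MLE
# for the Laplace and attains the better rate asymptotically.

rng = np.random.default_rng(0)
b, n, trials = 1.0, 1000, 20000

samples = rng.laplace(loc=0.0, scale=b, size=(trials, n))
mean_est = samples.mean(axis=1)          # variance ~ sigma^2/n = 2*b^2/n
median_est = np.median(samples, axis=1)  # variance ~ 1/(n*I) = b^2/n

print("n * variance of sample mean:  ", n * mean_est.var())    # ~ 2*b^2
print("n * variance of sample median:", n * median_est.var())  # ~ b^2
```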
