Minimax bounds for estimating multivariate Gaussian location mixtures

12/01/2020
by   Arlene K. H. Kim, et al.

We prove minimax bounds for estimating Gaussian location mixtures on ℝ^d under the squared L^2 and the squared Hellinger loss functions. Under the squared L^2 loss, we prove that the minimax rate is upper and lower bounded by a constant multiple of n^{-1}(log n)^{d/2}. Under the squared Hellinger loss, we consider two subclasses based on the behavior of the tails of the mixing measure. When the mixing measure has a sub-Gaussian tail, the minimax rate under the squared Hellinger loss is bounded from below by (log n)^d / n. On the other hand, when the mixing measure is only assumed to have a bounded p-th moment for a fixed p > 0, the minimax rate under the squared Hellinger loss is bounded from below by n^{-p/(p+d)}(log n)^{-3d/2}. These rates are minimax optimal up to logarithmic factors.
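The rates quoted above can be written in display form as follows. This is only a sketch of the statements in the abstract, under assumed standard notation: n is the sample size, d the dimension, the supremum runs over the relevant class of Gaussian location mixture densities f on ℝ^d, and H denotes the Hellinger distance; the class definitions themselves are not spelled out here.

```latex
% Sketch of the rates stated in the abstract (notation assumed, not taken from the paper).

% Squared L^2 loss: the minimax rate is upper and lower bounded by a constant
% multiple of n^{-1}(log n)^{d/2}.
\[
  \inf_{\hat f}\ \sup_{f}\ \mathbb{E}\,\lVert \hat f - f \rVert_2^2
  \;\asymp\; \frac{(\log n)^{d/2}}{n}.
\]

% Squared Hellinger loss, mixing measures with sub-Gaussian tails: lower bound.
\[
  \inf_{\hat f}\ \sup_{f}\ \mathbb{E}\, H^2\!\left(\hat f, f\right)
  \;\gtrsim\; \frac{(\log n)^{d}}{n}.
\]

% Squared Hellinger loss, mixing measures with a bounded p-th moment (p > 0 fixed): lower bound.
\[
  \inf_{\hat f}\ \sup_{f}\ \mathbb{E}\, H^2\!\left(\hat f, f\right)
  \;\gtrsim\; n^{-p/(p+d)} (\log n)^{-3d/2}.
\]
```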
