Minimax Optimal Additive Functional Estimation with Discrete Distribution: Slow Divergence Speed Case

01/12/2018
by Kazuto Fukuchi, et al.

This paper addresses the problem of estimating an additive functional of ϕ, defined as θ(P;ϕ) = ∑_{i=1}^k ϕ(p_i), given n i.i.d. random samples drawn from a discrete distribution P = (p_1, ..., p_k) with alphabet size k. In a previous paper, we revealed that the minimax optimal rate of this problem is characterized by the divergence speed of the fourth derivative of ϕ within a range of fast divergence speed. In this paper, we prove this fact for a more general range of the divergence speed. As a result, we show the minimax optimal rate of additive functional estimation for each range of the parameter α of the divergence speed. For α ∈ (1, 3/2), we show that the minimax rate is 1/n + k^2/(n ln n)^{2α}. Moreover, we show that the minimax rate is 1/n for α ∈ [3/2, 2].
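For readability, the rates stated above can be collected in display form. This is only a restatement of the abstract's claims under two hedged assumptions not made explicit here: the risk is the worst-case mean squared error over P, and "the minimax rate is" means equality up to constant factors (Θ-equivalence).

% Minimax rates from the abstract, collected in one display (assumes amsmath).
% Assumption: squared-error risk and Theta-equivalence up to constants.
\[
  \inf_{\hat\theta}\,\sup_{P}\; \mathbb{E}\bigl[(\hat\theta - \theta(P;\phi))^2\bigr]
  \;\asymp\;
  \begin{cases}
    \dfrac{k^2}{(n\ln n)^{2\alpha}} + \dfrac{1}{n}, & \alpha \in (1, 3/2),\\[1ex]
    \dfrac{1}{n}, & \alpha \in [3/2, 2],
  \end{cases}
  \qquad \theta(P;\phi) = \sum_{i=1}^{k} \phi(p_i).
\]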
