When Fourth Moments Are Enough

11/20/2017
by Chris Jennings-Shaffer, et al.

This note concerns a somewhat innocent question motivated by an observation on the use of Chebyshev bounds for sample estimates of p in the binomial distribution with parameters n, p: namely, what moment order produces the best Chebyshev estimate of p? If S_n(p) has a binomial distribution with parameters n, p, then it is readily observed that argmax_{0 < p < 1} E S_n^2(p) = argmax_{0 < p < 1} np(1-p) = 1/2, and E S_n^2(1/2) = n/4. Rabi Bhattacharya observed that while the second-moment Chebyshev sample size for a 95% confidence estimate within ±5 percentage points is n = 2000, the fourth moment yields the substantially reduced polling requirement of n = 775. Why stop at the fourth moment? Is the argmax achieved at p = 1/2 for higher-order moments and, if so, does it help, and can one compute E S_n^{2m}(1/2)? As captured by the title of this note, answers to these questions lead to a simple rule of thumb for the best choice of moments in terms of an effective sample size for Chebyshev concentration inequalities.
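To make the quoted sample sizes concrete, here is a minimal numeric sketch, not taken from the paper, that recovers n = 2000 and n = 775. It assumes S_n(p) denotes the centered binomial count, so that E S_n^2(p) = np(1-p) as in the abstract, uses the standard binomial identity E S_n^4(p) = npq(1 + 3(n-2)pq) for the fourth central moment, and evaluates both bounds at p = 1/2; the function name and the brute-force search are illustrative only.

def smallest_n(moment, eps=0.05, delta=0.05):
    """Smallest n for which the moment (Chebyshev/Markov) bound
    P(|p_hat - p| >= eps) <= E S_n^moment(1/2) / (n * eps)**moment
    is at most delta, for moment in {2, 4}, with S_n the centered count."""
    n = 1
    while True:
        pq = 0.25  # p(1-p) at the worst case p = 1/2
        if moment == 2:
            central = n * pq                            # E S_n^2(1/2) = n/4
        else:
            central = n * pq * (1 + 3 * (n - 2) * pq)   # E S_n^4(1/2)
        if central / (n * eps) ** moment <= delta:
            return n
        n += 1

print(smallest_n(2))  # 2000
print(smallest_n(4))  # 775

The search reproduces the reduction cited above from n = 2000 to n = 775; whether p = 1/2 remains the worst case for moments beyond the fourth, and whether higher moments keep helping, is the question the note pursues.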
