
Scaling of Model Approximation Errors and Expected Entropy Distances

07/14/2012
by Guido F. Montúfar, et al.
Max Planck Society
Penn State University

We compute the expected value of the Kullback-Leibler divergence from a random distribution to various fundamental statistical models, with respect to canonical priors on the probability simplex. We obtain closed formulas for the expected model approximation errors as functions of the dimension of the models and the cardinalities of their sample spaces. For the uniform prior, the expected divergence from any model containing the uniform distribution is bounded by the constant 1-γ, where γ is the Euler-Mascheroni constant; for the models that we consider, this bound is approached when the state space is very large and the models' dimension does not grow too fast. For Dirichlet priors the expected divergence is bounded in a similar way if the concentration parameters take reasonable values. These results serve as reference values for more complicated statistical models.
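As a quick illustration of the limiting constant 1-γ ≈ 0.4228, the sketch below (not taken from the paper; the function names and sample sizes are illustrative choices) considers the simplest case: the expected divergence from a distribution drawn uniformly from the probability simplex to the uniform distribution on N states. It uses the identity D(p‖u) = log N − H(p) together with the standard fact that the expected entropy of a flat-Dirichlet draw is ψ(N+1) − ψ(2), and checks the resulting closed form against a Monte Carlo estimate.

```python
import numpy as np
from scipy.special import digamma, xlogy


def expected_divergence_closed_form(N):
    # E[D(p || u)] = log N - E[H(p)] = log N - (psi(N + 1) - psi(2))
    return np.log(N) - digamma(N + 1) + digamma(2)


def expected_divergence_monte_carlo(N, samples=20_000, seed=0):
    rng = np.random.default_rng(seed)
    # Flat Dirichlet(1, ..., 1) = uniform prior on the (N-1)-simplex.
    p = rng.dirichlet(np.ones(N), size=samples)
    # D(p || u) = log N - H(p); xlogy handles the 0 * log 0 = 0 convention.
    kl = np.log(N) + xlogy(p, p).sum(axis=1)
    return kl.mean()


if __name__ == "__main__":
    for N in (2, 10, 100, 1000):
        print(f"N={N:5d}  closed form {expected_divergence_closed_form(N):.4f}  "
              f"Monte Carlo {expected_divergence_monte_carlo(N):.4f}")
    print(f"limit 1 - gamma = {1 - np.euler_gamma:.4f}")
```

Both columns should agree and increase toward 1 − γ as N grows, matching the scaling behaviour described in the abstract for a very large state space.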

