
Scaling of Model Approximation Errors and Expected Entropy Distances

by   Guido F. Montúfar, et al.
Max Planck Society
Penn State University

We compute the expected value of the Kullback-Leibler divergence to various fundamental statistical models with respect to canonical priors on the probability simplex. We obtain closed formulas for the expected model approximation errors, depending on the dimension of the models and the cardinalities of their sample spaces. For the uniform prior, the expected divergence from any model containing the uniform distribution is bounded by the constant 1 − γ, where γ is the Euler–Mascheroni constant. For the models that we consider, this bound is approached when the state space is very large and the models' dimension does not grow too fast. For Dirichlet priors the expected divergence is bounded in a similar way, provided the concentration parameters take reasonable values. These results serve as reference values for more complicated statistical models.
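The simplest instance of this result can be checked numerically. For the one-point model consisting of the uniform distribution u on n states, the divergence is D(p‖u) = log n − H(p), and for p drawn from the uniform prior on the simplex (i.e. Dirichlet with all concentration parameters 1) the expected value has the closed form 1 + log n − H_n, where H_n is the n-th harmonic number; since H_n ≈ log n + γ, this tends to 1 − γ as n grows. The sketch below, a Monte Carlo illustration (not code from the paper), compares the empirical average against the closed form and the limit:

```python
import numpy as np

def expected_kl_to_uniform(n, num_samples=200_000, seed=0):
    """Monte Carlo estimate of E[D(p || u)] for p drawn uniformly
    from the probability simplex on n states (Dirichlet(1, ..., 1))."""
    rng = np.random.default_rng(seed)
    p = rng.dirichlet(np.ones(n), size=num_samples)
    # D(p || u) = log n - H(p); treat 0 * log 0 as 0
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = np.where(p > 0, p * np.log(p), 0.0)
    return np.log(n) + terms.sum(axis=1).mean()

n = 100
harmonic = np.sum(1.0 / np.arange(1, n + 1))
exact = 1.0 + np.log(n) - harmonic        # closed form: 1 + log n - H_n
estimate = expected_kl_to_uniform(n)
limit = 1.0 - np.euler_gamma              # asymptotic bound 1 - gamma

print(f"Monte Carlo: {estimate:.4f}  exact: {exact:.4f}  limit: {limit:.4f}")
```

Already at n = 100 the exact value is within about 0.005 of the limiting bound 1 − γ ≈ 0.4228, illustrating how quickly the bound is approached when the state space grows.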



