An information scaling law: ζ = 3/4

10/25/2017
by Michael C. Abbott, et al.

Consider the entropy of a unit Gaussian convolved over a discrete set of K points, constrained to an interval of length L. Maximising this entropy fixes K, and we show that this number exhibits a novel scaling law K ∝ L^(1/ζ) as L → ∞, with exponent ζ = 3/4. This law was observed numerically in a recent paper about optimal effective theories; here we present an analytic derivation. We argue that this law is generic for channel capacity maximisation, or the equivalent minimax problem. We also briefly discuss the behaviour at the boundary of the interval, and higher-dimensional versions.
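The abstract frames the result as a channel-capacity maximisation: a Gaussian channel of unit noise whose input is confined to [0, L] is known to have a capacity-achieving input distribution supported on finitely many points, and the count of those points plays the role of K. The sketch below (my own construction, not the paper's derivation) explores this numerically with the Blahut-Arimoto algorithm on a discretised input grid; grid sizes, iteration counts, and function names are illustrative choices.

```python
import numpy as np

def blahut_arimoto(L, M=150, n_y=800, n_iter=2000, sigma=1.0):
    """Approximate the capacity-achieving input distribution of a Gaussian
    channel whose input is confined to [0, L], via Blahut-Arimoto on a grid.

    Returns (x, q, capacity): input grid, optimal weights, capacity in nats.
    """
    x = np.linspace(0.0, L, M)                         # candidate input points
    y = np.linspace(-5.0 * sigma, L + 5.0 * sigma, n_y)  # output grid with tails
    # discretised channel law p(y|x), one row per input point
    W = np.exp(-0.5 * ((y[None, :] - x[:, None]) / sigma) ** 2)
    W /= W.sum(axis=1, keepdims=True)
    logW = np.log(W)                                   # precompute once
    q = np.full(M, 1.0 / M)                            # start from uniform input
    for _ in range(n_iter):
        p_y = q @ W                                    # induced output law
        # D(x) = KL divergence of p(.|x) from p_y, for each input point
        D = (W * (logW - np.log(p_y)[None, :])).sum(axis=1)
        q = q * np.exp(D)                              # Blahut-Arimoto update
        q /= q.sum()
    # mutual information of the final q is exactly sum_x q(x) D(x)
    p_y = q @ W
    D = (W * (logW - np.log(p_y)[None, :])).sum(axis=1)
    return x, q, float(q @ D)

def count_atoms(q, rel_threshold=1e-3):
    """Count contiguous clusters of grid weight above a relative threshold;
    each cluster approximates one discrete atom of the optimal input."""
    above = q > rel_threshold * q.max()
    return int(np.count_nonzero(above[1:] & ~above[:-1]) + above[0])
```

For small L the optimal input collapses to atoms at the two endpoints; as L grows, the weights concentrate on more spikes and the capacity rises, which is the setting in which the K ∝ L^(1/ζ) scaling is claimed to emerge. Note the asymptotic exponent would only be visible at interval lengths well beyond what this brute-force grid can comfortably handle.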
