An Elementary Proof of a Classical Information-Theoretic Formula

09/05/2018
by Xianming Liu et al.

A renowned information-theoretic formula by Shannon expresses the mutual information rate of a white Gaussian channel with a stationary Gaussian input as an integral of a simple function of the power spectral density of the channel input. In this paper we give a rigorous yet elementary proof of this classical formula. In contrast to conventional approaches, which either rely on heavy mathematical machinery or resort to "external" results, our proof, which hinges on a recently proven sampling theorem, is elementary and self-contained, using only well-known facts from basic calculus and matrix theory.
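For context, one commonly cited statement of the formula in question is sketched below; the abstract does not spell it out, so the channel model, the symbols (X, Y, Z, S_X, N_0), and the normalization (two-sided spectra, ordinary frequency in Hz, natural logarithm) are assumptions of this sketch, and the exact constants vary with the convention used. For a channel Y(t) = X(t) + Z(t), where Z is white Gaussian noise with two-sided power spectral density N_0/2 and the input X is stationary Gaussian with power spectral density S_X(f), the mutual information rate is typically written as

\[
  I(X;Y) \;=\; \frac{1}{2}\int_{-\infty}^{\infty} \log\!\left(1 + \frac{S_X(f)}{N_0/2}\right) df
  \qquad \text{(nats per second).}
\]

As a sanity check under these assumptions, a flat input spectrum S_X(f) = S over the band |f| <= W recovers the familiar band-limited expression W log(1 + 2WS/(N_0 W)) with input power P = 2WS.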
