The Common Information of N Dependent Random Variables

10/18/2010
by Wei Liu, et al.

This paper generalizes Wyner's definition of the common information of a pair of random variables to N random variables. We prove coding theorems showing that the operational meanings of the common information of two random variables carry over to N random variables. As a byproduct of our proof, we show that the Gray-Wyner source coding network can be generalized to N source sequences with N decoders. We also establish a monotone property of Wyner's common information, in contrast to other notions of common information, specifically Shannon's mutual information and Gács and Körner's common randomness. Examples of the computation of Wyner's common information for N random variables are also given.
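
For context, here is a sketch of the quantity in question: Wyner's original definition for a pair of random variables, together with the natural N-variable generalization the abstract describes. The notation below is assumed for illustration; the paper's exact formulation may differ.

```latex
% Wyner's common information of a pair (X, Y):
% minimize over auxiliary variables W such that X - W - Y is a
% Markov chain, i.e. X and Y are conditionally independent given W.
C(X; Y) = \min_{W \,:\, X - W - Y} I(X, Y; W)

% The natural N-variable generalization: W must render
% X_1, ..., X_N mutually conditionally independent.
C(X_1; \dots; X_N)
  = \min_{W \,:\, P_{X_1 \cdots X_N \mid W} = \prod_{i=1}^{N} P_{X_i \mid W}}
    I(X_1, \dots, X_N; W)
```

Under this formulation the monotone property has a natural reading: any W that decouples X_1, ..., X_{N+1} also decouples the first N variables, and I(X_1, ..., X_{N+1}; W) ≥ I(X_1, ..., X_N; W), so one expects Wyner's common information to be non-decreasing as variables are added, whereas Gács and Körner's common randomness can only shrink.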


