Derivatives of mutual information in Gaussian channels

03/04/2023 · by Minh-Toan Nguyen, et al.

We derive a general formula for the derivatives of the mutual information between the inputs and outputs of multiple Gaussian channels with respect to the signal-to-noise ratios of the channels. The result displays a remarkable resemblance to the classical cumulant-moment relation.
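For intuition on what such a derivative formula asserts, the scalar case is the classical I-MMSE relation of Guo, Shamai, and Verdú: for the channel Y = √snr · X + N with standard Gaussian noise, dI/dsnr equals half the minimum mean-square error of estimating X from Y. A minimal numerical sketch, assuming a standard Gaussian input (for which both sides have closed forms); the function names below are illustrative, not from the paper:

```python
import numpy as np

# Scalar Gaussian channel Y = sqrt(snr)*X + N, with X, N ~ N(0, 1).
# For a Gaussian input, I(snr) = 0.5*log(1 + snr) nats and the
# MMSE of estimating X from Y is 1/(1 + snr), so the I-MMSE relation
# dI/dsnr = MMSE/2 can be checked directly.

def mutual_info(snr):
    """Mutual information in nats for a Gaussian input."""
    return 0.5 * np.log1p(snr)

def half_mmse(snr):
    """Right-hand side of the I-MMSE relation: MMSE(snr)/2."""
    return 0.5 / (1.0 + snr)

def di_dsnr_numeric(snr, h=1e-6):
    """Central finite-difference approximation of dI/dsnr."""
    return (mutual_info(snr + h) - mutual_info(snr - h)) / (2.0 * h)

snr = 2.0
print(half_mmse(snr))        # closed form: 1/6
print(di_dsnr_numeric(snr))  # finite difference, should agree to ~1e-10
```

The paper's formula generalizes this picture to higher derivatives and multiple channels, where the moment-like quantities on the right-hand side organize in a cumulant-moment pattern.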


research · 04/11/2023
Breakdown of a concavity property of mutual information for non-Gaussian channels
Let S and S̃ be two independent and identically distributed random varia...

research · 11/12/2018
Mutual Information of Wireless Channels and Block-Jacobi Ergodic Operators
Shannon's mutual information of a random multiple antenna and multipath ...

research · 02/06/2019
Subadditivity Beyond Trees and the Chi-Squared Mutual Information
In 2000, Evans et al. [Eva+00] proved the subadditivity of the mutual in...

research · 05/03/2018
Convexity of mutual information along the Ornstein-Uhlenbeck flow
We study the convexity of mutual information as a function of time along...

research · 08/28/2019
Linear Noise Approximation of Intensity-Driven Signal Transduction Channels
Biochemical signal transduction, a form of molecular communication, can ...

research · 08/22/2021
Bias for the Trace of the Resolvent and Its Application on Non-Gaussian and Non-centered MIMO Channels
The mutual information (MI) of Gaussian multi-input multi-output (MIMO) ...

research · 03/04/2019
Database Alignment with Gaussian Features
We consider the problem of aligning a pair of databases with jointly Gau...
