Reducing the Variance of Variational Estimates of Mutual Information by Limiting the Critic's Hypothesis Space to RKHS

by P Aditya Sreekar, et al.

Mutual information (MI) is an information-theoretic measure of the dependency between two random variables. Several methods have been proposed in the literature to estimate MI from samples of two random variables with unknown underlying probability distributions. Recent methods realize a parametric probability distribution, or critic, as a neural network to approximate unknown density ratios. The approximated density ratios are then used to estimate different variational lower bounds on MI. While these methods provide reliable estimates when the true MI is low, they produce high-variance estimates when the true MI is high. We argue that this high-variance characteristic is due to the uncontrolled complexity of the critic's hypothesis space. In support of this argument, we use the data-driven Rademacher complexity of the hypothesis space associated with the critic's architecture to analyse generalization error bounds for variational lower bound estimates of MI. We then show that the high-variance characteristic of these estimators can be negated by constraining the critic's hypothesis space to a Reproducing Kernel Hilbert Space (RKHS) corresponding to a kernel learned using Automated Spectral Kernel Learning (ASKL). By analysing the aforementioned generalization error bounds, we augment the overall optimisation objective with an effective regularisation term. We empirically demonstrate the efficacy of this regularisation in enforcing a proper bias-variance tradeoff on four variational lower bounds, namely NWJ, MINE, JS and SMILE.
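To make the variational lower bounds concrete, the sketch below evaluates two of the bounds named in the abstract — MINE (the Donsker–Varadhan bound) and NWJ — on correlated Gaussian samples. This is an illustrative example, not the paper's method: instead of a learned neural or RKHS critic, it plugs in the analytically known optimal critic (the log density ratio), which is available in closed form for this Gaussian pair; the variable names and the use of a permutation to approximate the product of marginals are our own choices.

```python
import numpy as np

rng = np.random.default_rng(0)
rho = 0.8       # correlation between X and Y
n = 100_000     # number of samples

# Joint samples from a standard bivariate Gaussian with correlation rho.
x = rng.standard_normal(n)
y = rho * x + np.sqrt(1 - rho**2) * rng.standard_normal(n)
# Shuffling y approximates samples from the product of marginals p(x)p(y).
y_marg = rng.permutation(y)

def log_ratio(x, y, rho):
    """Exact log density ratio log[p(x,y) / (p(x)p(y))] for this Gaussian pair."""
    return (-0.5 * np.log(1 - rho**2)
            - (rho**2 * (x**2 + y**2) - 2 * rho * x * y) / (2 * (1 - rho**2)))

f_joint = log_ratio(x, y, rho)       # critic on joint samples
f_marg = log_ratio(x, y_marg, rho)   # critic on (approximate) marginal samples

# MINE / Donsker-Varadhan bound: E_p[f] - log E_q[e^f], tight at f = log ratio.
i_mine = f_joint.mean() - np.log(np.exp(f_marg).mean())

# NWJ bound: E_p[f] - e^{-1} E_q[e^f], tight at f = 1 + log ratio,
# for which e^{-1} e^{1 + log r} simplifies to e^{log r}.
i_nwj = (f_joint + 1).mean() - np.exp(f_marg).mean()

# Ground truth for correlated Gaussians: I(X;Y) = -0.5 * log(1 - rho^2).
true_mi = -0.5 * np.log(1 - rho**2)
print(i_mine, i_nwj, true_mi)
```

With the optimal critic both estimates sit close to the true MI; the paper's concern is the regime where the critic must be learned and the true MI is high, in which the exponential terms above drive the variance of the estimates.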



