
A Practical Unified Notation for Information-Theoretic Quantities in ML

by   Andreas Kirsch, et al.

Information theory is important to machine learning, but the notation for information-theoretic quantities is sometimes opaque. The right notation can convey valuable intuitions and concisely express new ideas. We propose such a notation for machine learning users and expand it to include information-theoretic quantities between events (outcomes) and random variables. We apply this notation to a popular information-theoretic acquisition function in Bayesian active learning, which selects the most informative (unlabelled) samples to be labelled by an expert. We then demonstrate the value of our notation by extending the acquisition function to the core-set problem, which consists of selecting the most informative samples given the labels.
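The acquisition function described above is commonly known as BALD: it scores an unlabelled input by the mutual information between its predicted label and the model parameters, I[Y; Θ | x] = H[Y | x] − E_Θ[H[Y | x, Θ]]. As an illustration only (the paper's contribution is the notation, not this estimator), here is a minimal NumPy sketch that estimates the BALD score from posterior predictive samples, e.g. MC-dropout forward passes; the function name and shapes are our own assumptions:

```python
import numpy as np

def bald_scores(probs):
    """Estimate the BALD score I[Y; Theta | x] for each input.

    probs: array of shape (K, N, C) -- predicted class probabilities from
    K posterior samples (e.g. MC-dropout passes) for N inputs, C classes.
    Returns an array of shape (N,); higher = more informative to label.
    """
    eps = 1e-12  # avoid log(0)
    mean_probs = probs.mean(axis=0)  # predictive distribution, shape (N, C)
    # Entropy of the predictive distribution: H[Y | x]
    predictive_entropy = -(mean_probs * np.log(mean_probs + eps)).sum(axis=-1)
    # Expected entropy under the posterior: E_Theta[ H[Y | x, Theta] ]
    expected_entropy = -(probs * np.log(probs + eps)).sum(axis=-1).mean(axis=0)
    # Mutual information = total uncertainty minus expected aleatoric part
    return predictive_entropy - expected_entropy
```

The score is zero when all posterior samples agree (the uncertainty is purely aleatoric) and large when they confidently disagree, which is exactly the disagreement that labelling the point would resolve.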

