A Rigorous Theory of Conditional Mean Embeddings

by Ilja Klebanov et al.

Conditional mean embeddings (CMEs) have proven to be a powerful tool in many machine learning applications. They allow the efficient conditioning of probability distributions within the corresponding reproducing kernel Hilbert spaces (RKHSs) by providing a linear-algebraic relation between the kernel mean embeddings of the respective probability distributions. Both centered and uncentered covariance operators have been used to define CMEs in the existing literature. In this paper, we develop a mathematically rigorous theory for both variants, discuss the merits and problems of each, and significantly weaken the conditions for the applicability of CMEs. In the course of this, we demonstrate a beautiful connection to Gaussian conditioning in Hilbert spaces.
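To make the linear-algebraic relation mentioned above concrete, the following is a minimal sketch (not taken from the paper) of the standard empirical CME estimator: given samples (x_i, y_i), the conditional expectation E[f(Y) | X = x] is approximated by a weighted sum over the training outputs, with weights β(x) = (K + nλI)⁻¹ k_x obtained from the kernel Gram matrix. The RBF kernel, its bandwidth, and the regularization λ here are illustrative choices, not prescriptions from the paper.

```python
import numpy as np

def rbf_kernel(a, b, gamma=10.0):
    # Gaussian (RBF) kernel matrix between 1-D sample arrays a and b.
    d = a[:, None] - b[None, :]
    return np.exp(-gamma * d**2)

def cme_weights(x_train, x_query, lam=1e-6):
    # Empirical CME weights beta(x) = (K + n*lam*I)^{-1} k_x,
    # so that E[f(Y) | X = x] ~= sum_i beta_i(x) f(y_i).
    n = len(x_train)
    K = rbf_kernel(x_train, x_train)
    k = rbf_kernel(x_train, x_query)
    return np.linalg.solve(K + n * lam * np.eye(n), k)

# Toy data with a deterministic relation y = x^2, so that
# E[Y | X = x] = x^2 and the estimate can be checked directly.
x_train = np.linspace(-1.0, 1.0, 30)
y_train = x_train**2

beta = cme_weights(x_train, np.array([0.5]))
estimate = y_train @ beta  # approximates E[Y | X = 0.5] = 0.25
```

Note that the same weight vector β(x) embeds the full conditional distribution P(Y | X = x) in the RKHS on Y, so any RKHS function of Y can be conditioned with the same solve.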


