Conditional expectation using compactification operators

by Suddhasattwa Das et al.

The separate tasks of denoising, conditional expectation, and manifold learning can often be posed in a common setting: finding the conditional expectation arising from a product of two random variables. This paper focuses on this more general problem and describes an operator-theoretic approach to estimating the conditional expectation. Kernel integral operators are used as a compactification tool to set up the estimation problem as a linear inverse problem in a reproducing kernel Hilbert space. This equation is shown to have solutions that are stable under numerical approximation, thus guaranteeing the convergence of data-driven implementations. The overall technique is easy to implement, and its successful application to several real-world problems is also demonstrated.
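The paper's specific operator-theoretic construction is not reproduced in this abstract, but the general recipe it describes, regularizing a kernel integral operator to solve a linear inverse problem for E[Y | X = x], can be sketched with a plain kernel ridge regression estimator. The Gaussian kernel, bandwidth, and regularization parameter below are illustrative choices, not the paper's:

```python
import numpy as np

def conditional_expectation(X, Y, x_query, bandwidth=0.5, reg=1e-3):
    """Kernel-based estimate of E[Y | X = x] via a regularized linear inverse problem.

    Solves (K + n*reg*I) alpha = Y, where K is the Gram matrix of a
    Gaussian kernel on the samples of X, then evaluates k(x, X) @ alpha.
    """
    n = len(X)
    # Gaussian kernel Gram matrix on the training inputs
    d2 = (X[:, None] - X[None, :]) ** 2
    K = np.exp(-d2 / (2 * bandwidth ** 2))
    # Tikhonov-regularized solve: the regularization stabilizes the
    # inverse problem, mirroring the stability claim in the abstract
    alpha = np.linalg.solve(K + n * reg * np.eye(n), Y)
    # Cross-kernel between query points and training inputs
    k_query = np.exp(-(x_query[:, None] - X[None, :]) ** 2 / (2 * bandwidth ** 2))
    return k_query @ alpha

# Denoising example: Y = sin(X) + noise, so E[Y | X = x] = sin(x)
rng = np.random.default_rng(0)
X = rng.uniform(-np.pi, np.pi, 500)
Y = np.sin(X) + 0.3 * rng.normal(size=500)
x = np.array([0.0, np.pi / 2])
est = conditional_expectation(X, Y, x)   # should be close to [sin(0), sin(pi/2)]
```

The denoising task from the abstract appears here as a special case: the noisy observations regress onto the clean conditional mean sin(x).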


The linear conditional expectation in Hilbert space

The linear conditional expectation (LCE) provides a best linear (or rath...

Nonparametric approximation of conditional expectation operators

Given the joint distribution of two random variables X,Y on some second ...

Noise Contrastive Meta-Learning for Conditional Density Estimation using Kernel Mean Embeddings

Current meta-learning approaches focus on learning functional representa...

Optimal Rates for Regularized Conditional Mean Embedding Learning

We address the consistency of a kernel ridge regression estimate of the ...

Deep Learning with Kernels through RKHM and the Perron-Frobenius Operator

Reproducing kernel Hilbert C^*-module (RKHM) is a generalization of repr...

Singular Value Decomposition of Operators on Reproducing Kernel Hilbert Spaces

Reproducing kernel Hilbert spaces (RKHSs) play an important role in many...

A General Derivative Identity for the Conditional Expectation with Focus on the Exponential Family

Consider a pair of random vectors (𝐗,𝐘) and the conditional expectation ...
