Deep Stochastic Processes via Functional Markov Transition Operators

05/24/2023
by Jin Xu, et al.

We introduce Markov Neural Processes (MNPs), a new class of Stochastic Processes (SPs) constructed by stacking sequences of neurally parameterised Markov transition operators in function space. We prove that these Markov transition operators can preserve the exchangeability and consistency of SPs. The proposed iterative construction therefore adds substantial flexibility and expressivity to the original framework of Neural Processes (NPs) without compromising consistency or adding restrictions. Our experiments demonstrate clear advantages of MNPs over baseline models on a variety of tasks.
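To make the construction concrete, below is a minimal, hypothetical sketch of the idea: a stack of neurally parameterised Markov transition operators applied to the values of a random function at a set of input locations. The module names, the pointwise Gaussian form of each transition, and all shapes are illustrative assumptions for this sketch, not the authors' implementation.

```python
import torch
import torch.nn as nn

class TransitionOperator(nn.Module):
    """One Markov transition in function space: maps current function
    values f(x) at locations x to parameters of a conditional
    distribution over the next function values."""
    def __init__(self, hidden_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2, hidden_dim),  # input: (x, f(x)) pairs
            nn.ReLU(),
            nn.Linear(hidden_dim, 2),  # output: mean and log-variance
        )

    def forward(self, x, f):
        out = self.net(torch.stack([x, f], dim=-1))
        mean, log_var = out[..., 0], out[..., 1]
        # Sample the next function values; a pointwise Gaussian is an
        # illustrative choice here, not the paper's transition kernel.
        return mean + torch.exp(0.5 * log_var) * torch.randn_like(mean)

class MarkovNeuralProcess(nn.Module):
    """Stack of transition operators applied to an initial random function."""
    def __init__(self, num_layers=4):
        super().__init__()
        self.layers = nn.ModuleList(TransitionOperator() for _ in range(num_layers))

    def forward(self, x):
        f = torch.randn_like(x)  # initial function values, e.g. white noise
        for layer in self.layers:
            f = layer(x, f)
        return f

# Usage: sample a function on a grid of 100 input locations.
x = torch.linspace(-3, 3, 100)
model = MarkovNeuralProcess()
sample = model(x)
```

Note that each layer conditions only on the current function values, which reflects the Markov structure named in the title; the paper's actual transition kernels, consistency guarantees, and training objective are not reproduced here.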

Related research

04/02/2020
Kernel autocovariance operators of stationary processes: Estimation and convergence
We consider autocovariance operators of a stationary stochastic process ...

03/25/2016
Markov substitute processes: a new model for linguistics and beyond
We introduce Markov substitute processes, a new model at the crossroad o...

12/18/2018
On a flexible construction of a negative binomial model
This work presents a construction of stationary Markov models with negat...

12/15/2020
Stochastic monotonicity and the Markov product for copulas
Given two random variables X and Y, stochastic monotonicity describes a ...

01/30/2023
Evidential Decision Theory via Partial Markov Categories
We introduce partial Markov categories. In the same way that Markov cate...

06/20/2023
Time-Varying Transition Matrices with Multi-task Gaussian Processes
In this paper, we present a kernel-based, multi-task Gaussian Process (G...

04/24/2023
UTSGAN: Unseen Transition Suss GAN for Transition-Aware Image-to-image Translation
In the field of Image-to-Image (I2I) translation, ensuring consistency b...
