A category theory framework for Bayesian learning

by Kotaro Kamiya et al.

Inspired by the foundational works of Spivak and Fong and of Cruttwell et al., we introduce a categorical framework that formalizes Bayesian inference and learning. The two key ideas at play are the notion of Bayesian inversion and the functor GL constructed by Cruttwell et al. In this context, Bayesian learning emerges as the simplest case of the learning paradigm. We then obtain categorical formulations of batch and sequential Bayes updates and verify that the two coincide in a specific example.
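The claim that batch and sequential Bayes updates coincide can be illustrated concretely with a conjugate Beta-Bernoulli model (a sketch for illustration only; the paper's own example and formalism may differ):

```python
# Illustration (not the paper's construction): for a conjugate
# Beta-Bernoulli model, the batch Bayes update and the sequential
# one-observation-at-a-time update yield the same posterior parameters.

def batch_update(alpha, beta, data):
    """Batch Bayes update: incorporate all observations at once."""
    successes = sum(data)
    return alpha + successes, beta + len(data) - successes

def sequential_update(alpha, beta, data):
    """Sequential Bayes update: fold observations in one at a time."""
    for x in data:
        alpha, beta = alpha + x, beta + (1 - x)
    return alpha, beta

data = [1, 0, 1, 1, 0, 1]   # Bernoulli observations
prior = (1, 1)              # uniform Beta(1, 1) prior
assert batch_update(*prior, data) == sequential_update(*prior, data)
print(batch_update(*prior, data))  # -> (5, 3), i.e. a Beta(5, 3) posterior
```

Conjugacy is what makes the two update orders agree exactly here; the categorical framework is what lets such coincidences be stated and checked at the level of morphisms rather than parameter arithmetic.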




