Self-explaining variational posterior distributions for Gaussian Process models

09/08/2021
by   Sarem Seitz, et al.

Bayesian methods have become a popular way to incorporate prior knowledge and a notion of uncertainty into machine learning models. At the same time, the complexity of modern machine learning makes it challenging to comprehend a model's reasoning process, let alone to express specific prior assumptions in a rigorous manner. While we are primarily interested in the former issue, recent developments in transparent machine learning could also broaden the range of prior information that we can provide to complex Bayesian models. Inspired by the idea of self-explaining models, we introduce a corresponding concept for variational Gaussian Processes. On the one hand, our contribution improves transparency for these types of models. More importantly, our proposed self-explaining variational posterior distribution makes it possible to incorporate both general prior knowledge about the target function as a whole and prior knowledge about the contributions of individual features.
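The abstract does not spell out the model, but the idea of attributing a posterior to individual features can be illustrated with a standard additive Gaussian Process: place an independent GP prior on each feature, so the posterior mean decomposes exactly into per-feature contributions. The sketch below is a minimal, assumption-laden illustration of that decomposition (exact GP regression with a sum of one-dimensional RBF kernels), not the paper's actual self-explaining variational posterior; all function names and parameters here are hypothetical.

```python
import numpy as np

def rbf(x1, x2, lengthscale=1.0):
    # Squared-exponential kernel on a single feature column.
    d = x1[:, None] - x2[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def additive_gp_posterior_mean(X, y, Xs, noise=0.1):
    """Posterior mean of an additive GP, split into per-feature parts.

    The prior is f(x) = sum_j f_j(x_j) with independent GP priors on
    each f_j, so the posterior mean is a sum of per-feature terms.
    """
    n, d = X.shape
    K = sum(rbf(X[:, j], X[:, j]) for j in range(d))
    alpha = np.linalg.solve(K + noise**2 * np.eye(n), y)
    # Contribution of each feature to the posterior mean at test inputs Xs.
    parts = [rbf(Xs[:, j], X[:, j]) @ alpha for j in range(d)]
    return np.sum(parts, axis=0), parts

rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(50, 2))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1]
Xs = rng.uniform(-2, 2, size=(5, 2))
mean, parts = additive_gp_posterior_mean(X, y, Xs)
# The per-feature parts sum exactly to the full posterior mean.
assert np.allclose(mean, parts[0] + parts[1])
```

In this additive setting, inspecting `parts[j]` shows how feature `j` shapes the prediction, which is the kind of feature-level transparency the abstract alludes to; the paper's contribution additionally lets one place prior knowledge on such per-feature terms within a variational posterior.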


research
04/21/2023

Machine Learning and the Future of Bayesian Computation

Bayesian models are a powerful tool for studying complex data, allowing ...
research
01/23/2019

Deep Mean Functions for Meta-Learning in Gaussian Processes

Fitting machine learning models in the low-data limit is challenging. Th...
research
02/06/2019

Regularizing Generative Models Using Knowledge of Feature Dependence

Generative modeling is a fundamental problem in machine learning with ma...
research
06/05/2019

Combining Physics-Based Domain Knowledge and Machine Learning using Variational Gaussian Processes with Explicit Linear Prior

Centuries of development in natural sciences and mathematical modeling p...
research
10/28/2022

A Novel Sparse Bayesian Learning and Its Application to Fault Diagnosis for Multistation Assembly Systems

This paper addresses the problem of fault diagnosis in multistation asse...
research
09/17/2020

Kohn-Sham equations as regularizer: building prior knowledge into machine-learned physics

Including prior knowledge is important for effective machine learning mo...
research
03/13/2020

Dynamic transformation of prior knowledge into Bayesian models for data streams

We consider how to effectively use prior knowledge when learning a Bayes...
