MARS: Meta-Learning as Score Matching in the Function Space

by Krunoslav Lehman Pavasovic, et al.
ETH Zurich

Meta-learning aims to extract useful inductive biases from a set of related datasets. In Bayesian meta-learning, this is typically achieved by constructing a prior distribution over neural network parameters. However, specifying families of computationally viable prior distributions over the high-dimensional neural network parameters is difficult. As a result, existing approaches resort to meta-learning restrictive diagonal Gaussian priors, severely limiting their expressiveness and performance. To circumvent these issues, we approach meta-learning through the lens of functional Bayesian neural network inference, which views the prior as a stochastic process and performs inference in the function space. Specifically, we view the meta-training tasks as samples from the data-generating process and formalize meta-learning as empirically estimating the law of this stochastic process. Our approach can seamlessly acquire and represent complex prior knowledge by meta-learning the score function of the data-generating process marginals instead of parameter space priors. In a comprehensive benchmark, we demonstrate that our method achieves state-of-the-art performance in terms of predictive accuracy and substantial improvements in the quality of uncertainty estimates.
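The abstract's core idea, estimating the score of the data-generating process's finite marginals rather than a parameter-space prior, can be illustrated with a toy numpy sketch. Here the marginal over function values at a fixed measurement set is approximated as a Gaussian, so its score has the closed form -Σ⁻¹(y - μ); this Gaussian stand-in and the sine-task generator are illustrative assumptions, whereas MARS itself meta-learns a neural score network over random measurement sets.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data-generating process: each task is a noisy sine with a random phase.
def sample_task(xs, rng):
    phase = rng.uniform(0.0, 2.0 * np.pi)
    return np.sin(xs + phase) + 0.1 * rng.normal(size=xs.shape)

xs = np.linspace(0.0, np.pi, 5)                                  # measurement set
tasks = np.stack([sample_task(xs, rng) for _ in range(2000)])    # (n_tasks, m)

# Gaussian approximation of the marginal p(f(xs)): fit mean and covariance
# from the meta-training tasks (viewed as samples of the stochastic process).
mu = tasks.mean(axis=0)
cov = np.cov(tasks, rowvar=False) + 1e-6 * np.eye(len(xs))       # jitter for stability

def score(y):
    """Estimated score ∇_y log p(y) of the marginal at the measurement set xs."""
    return -np.linalg.solve(cov, y - mu)

# The score points from y back toward the high-density region of the marginal.
y = tasks[0]
print(score(y).shape)  # (5,)
```

In the paper's setting, this closed-form Gaussian score is replaced by a learned score network, and the resulting functional prior guides Bayesian neural network inference on new tasks.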




Related papers:

- Meta-Learning Reliable Priors in the Function Space
- Deep Mean Functions for Meta-Learning in Gaussian Processes
- PAC-Bayesian Meta-Learning: From Theory to Practice
- PACOH: Bayes-Optimal Meta-Learning with PAC-Guarantees
- Amortised Inference in Bayesian Neural Networks
- A contrastive rule for meta-learning
- CMVAE: Causal Meta VAE for Unsupervised Meta-Learning
