Model Fusion with Kullback–Leibler Divergence

07/13/2020
by Sebastian Claici, et al.

We propose a method to fuse posterior distributions learned from heterogeneous datasets. Our algorithm relies on a mean field assumption for both the fused model and the individual dataset posteriors and proceeds using a simple assign-and-average approach. The components of the dataset posteriors are assigned to the proposed global model components by solving a regularized variant of the assignment problem. The global components are then updated based on these assignments by their mean under a KL divergence. For exponential family variational distributions, our formulation leads to an efficient non-parametric algorithm for computing the fused model. Our algorithm is easy to describe and implement, efficient, and competitive with state-of-the-art on motion capture analysis, topic modeling, and federated learning of Bayesian neural networks.
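
To make the assign-and-average idea above concrete, here is a minimal, hypothetical Python sketch assuming diagonal-Gaussian mean-field components. A plain Hungarian assignment (SciPy's linear_sum_assignment) stands in for the paper's regularized assignment, and the KL averaging step is realized as moment matching of the assigned components; the names fuse and kl_diag_gauss are illustrative, not from the paper.

```python
# Hypothetical sketch of an assign-and-average fusion loop for
# diagonal-Gaussian mean-field components (not the paper's exact algorithm).
import numpy as np
from scipy.optimize import linear_sum_assignment


def kl_diag_gauss(mu_p, var_p, mu_q, var_q):
    """KL( N(mu_p, var_p) || N(mu_q, var_q) ) for diagonal Gaussians."""
    return 0.5 * np.sum(
        np.log(var_q / var_p) + (var_p + (mu_p - mu_q) ** 2) / var_q - 1.0
    )


def fuse(local_models, n_global, n_iters=10):
    """local_models: list of (means, variances) pairs, each of shape (K_j, d)."""
    # Initialize the global components from the first local model
    # (assumes it has at least n_global components).
    mu_g = local_models[0][0][:n_global].copy()
    var_g = local_models[0][1][:n_global].copy()

    for _ in range(n_iters):
        assigned_mu = [[] for _ in range(n_global)]
        assigned_var = [[] for _ in range(n_global)]
        for mu_l, var_l in local_models:
            # Assignment step: match local components to global ones by KL cost.
            cost = np.array([
                [kl_diag_gauss(mu_l[k], var_l[k], mu_g[g], var_g[g])
                 for g in range(n_global)]
                for k in range(len(mu_l))
            ])
            rows, cols = linear_sum_assignment(cost)
            for k, g in zip(rows, cols):
                assigned_mu[g].append(mu_l[k])
                assigned_var[g].append(var_l[k])

        # Averaging step: for exponential families, the KL "mean" used here is
        # moment matching, i.e. averaging expected sufficient statistics (x, x^2).
        for g in range(n_global):
            if not assigned_mu[g]:
                continue
            mus = np.stack(assigned_mu[g])
            vars_ = np.stack(assigned_var[g])
            m1 = mus.mean(axis=0)                 # E[x]
            m2 = (vars_ + mus ** 2).mean(axis=0)  # E[x^2]
            mu_g[g] = m1
            var_g[g] = np.maximum(m2 - m1 ** 2, 1e-8)
    return mu_g, var_g
```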
