Bayesian hierarchical stacking

01/22/2021
by Yuling Yao, et al.

Stacking is a widely used model averaging technique that yields asymptotically optimal predictions among all linear averages. We show that stacking is most effective when the models' predictive performance is heterogeneous across inputs, so that the stacked mixture can be further improved with a hierarchical model. With input-varying yet partially pooled model weights, hierarchical stacking improves both average and conditional predictions. Our Bayesian formulation includes constant-weight (complete-pooling) stacking as a special case. We generalize the method to incorporate discrete and continuous inputs, other structured priors, and time-series and longitudinal data, and demonstrate it on several applied problems.
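As a rough sketch of the two objectives involved (the notation below is our paraphrase of the abstract, not taken verbatim from the paper): complete-pooling stacking chooses a single simplex-constrained weight vector by maximizing a leave-one-out log score,

    \max_{w \in S_K} \sum_{i=1}^{n} \log \sum_{k=1}^{K} w_k \, p(y_i \mid y_{-i}, M_k),
    \qquad S_K = \{ w : w_k \ge 0, \ \textstyle\sum_{k} w_k = 1 \},

whereas hierarchical stacking lets the weights vary with the input x, for example through a softmax of unconstrained functions of x,

    w_k(x) = \frac{\exp\big(\mu_k + \alpha_k^\top f(x)\big)}{\sum_{k'=1}^{K} \exp\big(\mu_{k'} + \alpha_{k'}^\top f(x)\big)},
    \qquad \alpha_k \sim \mathrm{normal}(0, \sigma^2 I),

with a hierarchical prior that partially pools the coefficients \alpha_k toward zero; setting every \alpha_k = 0 recovers the constant-weight (complete-pooling) special case mentioned in the abstract.

For concreteness, here is a minimal Python sketch of the complete-pooling baseline, assuming you already have a matrix of leave-one-out predictive densities; the function and variable names (stacking_weights, loo_dens) are ours, not from the paper:

    import numpy as np
    from scipy.optimize import minimize

    def stacking_weights(loo_dens):
        """Complete-pooling stacking weights.

        loo_dens: (n, K) array with entries p(y_i | y_{-i}, M_k),
        the leave-one-out predictive density of observation i under model k.
        """
        n, K = loo_dens.shape

        def neg_log_score(z):
            # Parameterize the simplex via a softmax of K-1 free logits
            # (the last logit is pinned to 0 for identifiability).
            logits = np.append(z, 0.0)
            w = np.exp(logits - logits.max())
            w /= w.sum()
            return -np.sum(np.log(loo_dens @ w))

        res = minimize(neg_log_score, np.zeros(K - 1), method="BFGS")
        logits = np.append(res.x, 0.0)
        w = np.exp(logits - logits.max())
        return w / w.sum()

Hierarchical stacking replaces the single vector w with the input-dependent weights w_k(x) above and fits the coefficients with full Bayesian inference rather than a point optimum.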

Related research

- 02/10/2020: Hierarchical Gaussian Process Priors for Bayesian Neural Network Weights
  "Probabilistic neural networks are typically modeled with independent wei..."

- 05/12/2023: Locking and Quacking: Stacking Bayesian model predictions by log-pooling and superposition
  "Combining predictions from different models is a central problem in Baye..."

- 09/08/2022: Hierarchical Graph Pooling is an Effective Citywide Traffic Condition Prediction Model
  "Accurate traffic conditions prediction provides a solid foundation for v..."

- 12/14/2021: Local Prediction Pools
  "We propose local prediction pools as a method for combining the predicti..."

- 12/24/2019: Bayesian Aggregation
  "A general challenge in statistics is prediction in the presence of multi..."

- 06/24/2014: Combining predictions from linear models when training and test inputs differ
  "Methods for combining predictions from different models in a supervised ..."
