Bayesian Deep Net GLM and GLMM

05/25/2018
by Minh-Ngoc Tran, et al.

Deep feedforward neural networks (DFNNs) are a powerful tool for functional approximation. We describe flexible versions of generalized linear and generalized linear mixed models that incorporate basis functions formed by a DFNN. Neural networks with random effects have received little attention in the literature, perhaps because of the computational challenges of incorporating subject-specific parameters into already complex models. Efficient computational methods for high-dimensional Bayesian inference are developed using Gaussian variational approximation, with a parsimonious but flexible factor parametrization of the covariance matrix. We implement natural gradient methods for the optimization, exploiting the factor structure of the variational covariance matrix in the computation of the natural gradient. Our flexible DFNN models and Bayesian inference approach lead to a regression and classification method with high prediction accuracy that can quantify prediction uncertainty in a principled and convenient way. We also describe how to perform variable selection in our deep learning method. The proposed methods are illustrated in a wide range of simulated and real-data examples, and the results compare favourably to a state-of-the-art flexible regression and classification method in the statistical literature, the Bayesian additive regression trees (BART) method. User-friendly software packages in Matlab, R and Python implementing the proposed methods are available at https://github.com/VBayesLab
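The factor parametrization writes the variational covariance as Sigma = B B^T + D^2, with B a p x k matrix of factor loadings (k much smaller than p) and D diagonal, so only O(pk) variational parameters are needed. The sketch below illustrates this idea on a toy logistic-regression posterior, using plain reparameterized stochastic gradient ascent rather than the paper's natural-gradient updates; the names (grad_log_joint, fit_factor_va, num_factors) are illustrative and are not taken from the VBayesLab packages.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: logistic regression with p coefficients.
n, p = 200, 5
X = rng.normal(size=(n, p))
beta_true = rng.normal(size=p)
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-X @ beta_true)))

def grad_log_joint(theta):
    """Gradient of log p(y, theta): Bernoulli log-likelihood plus a N(0, 10 I) prior."""
    probs = 1.0 / (1.0 + np.exp(-(X @ theta)))
    return X.T @ (y - probs) - theta / 10.0

def fit_factor_va(num_factors=2, num_iters=3000, lr=0.005, mc=5):
    """Fit q(theta) = N(mu, B B^T + diag(d)^2) by reparameterized stochastic gradient ascent."""
    mu = np.zeros(p)
    B = 0.01 * rng.normal(size=(p, num_factors))
    d = 0.1 * np.ones(p)
    for _ in range(num_iters):
        Sigma_inv = np.linalg.inv(B @ B.T + np.diag(d ** 2))
        # Gradients of the Gaussian entropy term of the ELBO (the paper exploits
        # the factor structure via the Woodbury formula; a direct inverse suffices here).
        g_mu = np.zeros(p)
        g_B = Sigma_inv @ B
        g_d = d * np.diag(Sigma_inv)
        # Monte Carlo estimate of the expected log-joint gradients via the
        # reparameterization theta = mu + B*eps + d*zeta.
        for _ in range(mc):
            eps = rng.normal(size=num_factors)
            zeta = rng.normal(size=p)
            theta = mu + B @ eps + d * zeta
            g = grad_log_joint(theta) / mc
            g_mu += g
            g_B += np.outer(g, eps)
            g_d += g * zeta
        mu, B, d = mu + lr * g_mu, B + lr * g_B, d + lr * g_d
    return mu, B, d

mu, B, d = fit_factor_va()
print("variational mean:   ", np.round(mu, 2))
print("true coefficients:  ", np.round(beta_true, 2))
print("posterior std (est):", np.round(np.sqrt(np.diag(B @ B.T) + d ** 2), 2))
```

The same factor structure keeps the number of variational parameters linear in the model dimension, which is what makes Gaussian variational approximation feasible for DFNN weights; the paper additionally uses natural gradients, which this sketch omits for brevity.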
