PASS-GLM: polynomial approximate sufficient statistics for scalable Bayesian GLM inference

09/26/2017
by Jonathan H. Huggins, et al.

Generalized linear models (GLMs) -- such as logistic regression, Poisson regression, and robust regression -- provide interpretable models for diverse data types. Probabilistic approaches, particularly Bayesian ones, allow coherent estimates of uncertainty, incorporation of prior information, and sharing of power across experiments via hierarchical models. In practice, however, the approximate Bayesian methods necessary for inference have either failed to scale to large data sets or failed to provide theoretical guarantees on the quality of inference. We propose a new approach based on constructing polynomial approximate sufficient statistics for GLMs (PASS-GLM). We demonstrate that our method admits a simple algorithm as well as trivial streaming and distributed extensions that do not compound error across computations. We provide theoretical guarantees on the quality of point (MAP) estimates, the approximate posterior, and posterior mean and uncertainty estimates. We validate our approach empirically in the case of logistic regression using a quadratic approximation and show competitive performance with stochastic gradient descent, MCMC, and the Laplace approximation in terms of speed and multiple measures of accuracy -- including on an advertising data set with 40 million data points and 20,000 covariates.
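To make the idea concrete, here is a minimal sketch of the quadratic (degree-2) logistic-regression case the abstract describes. This is not the authors' code: the function names (fit_quadratic_mapping, pass_lr2_statistics, approx_map), the least-squares polynomial fit on a grid (the paper builds its approximation from Chebyshev polynomials), the truncation radius R, and the Gaussian prior are all illustrative assumptions.

```python
import numpy as np

def fit_quadratic_mapping(R=4.0, grid_size=1000):
    """Degree-2 polynomial approximation of the logistic mapping
    phi(s) = -log(1 + exp(-s)) on [-R, R].  A simple least-squares fit
    on a grid stands in for the paper's Chebyshev approximation."""
    s = np.linspace(-R, R, grid_size)
    phi = -np.logaddexp(0.0, -s)            # numerically stable -log(1 + e^{-s})
    b2, b1, b0 = np.polyfit(s, phi, 2)      # phi(s) ~= b0 + b1*s + b2*s^2
    return b0, b1, b2

def pass_lr2_statistics(X, y):
    """Single pass over the data: accumulate the approximate sufficient
    statistics.  With labels y in {-1, +1}, y_n^2 = 1, so the quadratic
    term only needs sum_n x_n x_n^T."""
    t1 = (y[:, None] * X).sum(axis=0)       # sum_n y_n x_n
    t2 = X.T @ X                            # sum_n x_n x_n^T
    return len(y), t1, t2

def approx_map(X, y, prior_precision=1.0, R=4.0):
    """Approximate MAP estimate under a N(0, prior_precision^{-1} I) prior.
    Because the surrogate log-likelihood is quadratic in theta, the MAP
    estimate is the solution of a single linear system."""
    b0, b1, b2 = fit_quadratic_mapping(R)   # b0 only shifts the objective
    n, t1, t2 = pass_lr2_statistics(X, y)
    # maximize: b1 * t1 @ theta + b2 * theta @ t2 @ theta - 0.5 * prior_precision * ||theta||^2
    A = prior_precision * np.eye(X.shape[1]) - 2.0 * b2 * t2  # b2 < 0, so A is positive definite
    return np.linalg.solve(A, b1 * t1)

# Illustrative usage on synthetic data.
rng = np.random.default_rng(0)
theta_true = rng.normal(size=5)
X = rng.normal(size=(10_000, 5))
y = np.where(rng.random(10_000) < 1.0 / (1.0 + np.exp(-X @ theta_true)), 1.0, -1.0)
print(approx_map(X, y))
```

Because the approximate sufficient statistics are plain sums over the data, partial sums computed on separate shards or arriving streams can simply be added together, which is what the abstract means by streaming and distributed extensions that do not compound error across computations.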
