Horseshoe Regularization for Machine Learning in Complex and Deep Models

04/24/2019
by Anindya Bhadra, et al.

Since the advent of the horseshoe prior for regularization, global-local shrinkage methods have proved to be a fertile ground for the development of Bayesian methodology in machine learning, specifically for high-dimensional regression and classification problems. They have achieved remarkable success in computation and enjoy strong theoretical support. Most of the existing literature has focused on the linear Gaussian case; see Bhadra et al. (2019) for a systematic survey. The purpose of the current article is to demonstrate that horseshoe regularization is useful far more broadly, by reviewing both methodological and computational developments in complex models that are more relevant to machine learning applications. Specifically, we focus on methodological challenges in horseshoe regularization for nonlinear and non-Gaussian models, multivariate models, and deep neural networks. We also outline recent computational developments in horseshoe shrinkage for complex models, along with a list of available software implementations that allows one to venture beyond the comfort zone of canonical linear regression problems.
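To fix ideas for readers unfamiliar with the terminology, the following sketch illustrates the global-local structure in the canonical linear regression setting; it is an illustration, not a reproduction of the article's own notation or software. The horseshoe prior of Carvalho, Polson and Scott (2010) places

    beta_j | lambda_j, tau  ~  Normal(0, lambda_j^2 * tau^2),   j = 1, ..., p
    lambda_j                ~  Half-Cauchy(0, 1)
    tau                     ~  Half-Cauchy(0, 1)

where the global scale tau pulls all coefficients toward zero, while the heavy-tailed local scales lambda_j allow individual signals to escape the shrinkage. A minimal sketch of this prior in a probabilistic programming framework (NumPyro is used here purely for illustration; the function and variable names are hypothetical) might read:

    import jax.numpy as jnp
    import numpyro
    import numpyro.distributions as dist

    def horseshoe_regression(X, y=None):
        # Global (tau) and local (lambda_j) half-Cauchy scales of the horseshoe prior
        n, p = X.shape
        tau = numpyro.sample("tau", dist.HalfCauchy(1.0))
        lam = numpyro.sample("lam", dist.HalfCauchy(jnp.ones(p)))
        # Conditionally Gaussian coefficients, beta_j ~ N(0, tau^2 * lambda_j^2)
        beta = numpyro.sample("beta", dist.Normal(jnp.zeros(p), tau * lam))
        sigma = numpyro.sample("sigma", dist.HalfNormal(1.0))
        # Linear Gaussian likelihood; the article reviews models beyond this case
        numpyro.sample("y", dist.Normal(jnp.dot(X, beta), sigma), obs=y)

Posterior sampling can then be carried out with, for example, numpyro.infer.MCMC and a NUTS kernel. The nonlinear, non-Gaussian, multivariate and deep-learning extensions surveyed in the article replace the Gaussian likelihood and the linear predictor above, while the global-local shrinkage structure carries over.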

Related research

12/24/2017 - Nearly optimal Bayesian Shrinkage for High Dimensional Regression
During the past decade, shrinkage priors have received much attention in...

08/15/2022 - Intuitive Joint Priors for Bayesian Linear Multilevel Models: The R2D2M2 prior
The training of high-dimensional regression models on comparably sparse ...

02/17/2019 - Bayesian Regularization: From Tikhonov to Horseshoe
Bayesian regularization is a central tool in modern-day statistical and ...

04/05/2017 - On Generalization and Regularization in Deep Learning
Why do large neural networks generalize so well on complex tasks such as ...

11/27/2015 - Regularized EM Algorithms: A Unified Framework and Statistical Guarantees
Latent variable models are a fundamental modeling tool in machine learning...

06/24/2023 - A Unified Approach to Controlling Implicit Regularization via Mirror Descent
Inspired by the remarkable success of deep neural networks, there has been...

08/21/2017 - Deep vs. Diverse Architectures for Classification Problems
This study compares various superlearner and deep learning architectures...
