Large Scale Variational Bayesian Inference for Structured Scale Mixture Models

06/27/2012
by Young Jun Ko, et al.

Natural image statistics exhibit hierarchical dependencies across multiple scales. Representing such prior knowledge in non-factorial latent tree models can substantially boost the performance of image denoising, inpainting, deconvolution, or reconstruction, beyond standard factorial "sparse" methodology. We derive a large-scale approximate Bayesian inference algorithm for linear models with non-factorial (latent tree-structured) scale mixture priors. Experimental results on a range of denoising and inpainting problems demonstrate substantially improved performance compared to MAP estimation or to inference with factorial priors.
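
The abstract gives no code, but the factorial baseline it improves on, variational Gaussian inference for a linear model with a Gaussian scale mixture (here Laplace) prior, can be sketched in a few lines. Everything below (problem sizes, the rate tau, the toy data) is an illustrative assumption, and the dense matrix inverse is exactly the step that large-scale algorithms like the paper's must avoid. A non-factorial tree-structured prior would couple the scale variables across scales, which this toy deliberately does not do.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear observation model: y = X s + noise, with a sparse signal s.
n, d = 80, 120
X = rng.standard_normal((n, d)) / np.sqrt(n)
s_true = np.zeros(d)
s_true[rng.choice(d, size=10, replace=False)] = 3.0 * rng.standard_normal(10)
sigma2 = 0.01
y = X @ s_true + np.sqrt(sigma2) * rng.standard_normal(n)

tau = 5.0  # Laplace prior rate (assumed hyperparameter)

# Variational Gaussian approximation q(s) = N(m, V), using the
# super-Gaussian bound |s| = min_{w>0} [w s^2 / 2 + 1 / (2 w)]
# applied independently (factorially) to each coefficient.
w = np.ones(d)  # variational scale parameters, one per coefficient
for _ in range(50):
    # Given w, the optimal q(s) is Gaussian with precision A and mean m.
    A = X.T @ X / sigma2 + tau * np.diag(w)
    V = np.linalg.inv(A)            # fine at toy scale; large-scale methods
    m = V @ X.T @ y / sigma2        # must approximate this computation
    # Update each w_i from the posterior second moment E_q[s_i^2].
    w = 1.0 / np.sqrt(m**2 + np.diag(V))

print("reconstruction error:", np.linalg.norm(m - s_true))
```

The update w_i = 1/sqrt(m_i^2 + V_ii) minimizes the expected bound under q. MAP estimation would use only m_i^2 here; feeding the posterior marginal variance V_ii into the scale updates is what separates variational Bayesian inference from MAP in this model family.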

Related research

Variational Inference and Sparsity in High-Dimensional Deep Gaussian Mixture Models (05/04/2021)
Gaussian mixture models are a popular tool for model-based clustering, a...

GAN-based Priors for Quantifying Uncertainty (03/27/2020)
Bayesian inference is used extensively to quantify the uncertainty in an...

Image denoising using group sparsity residual and external nonlocal self-similarity prior (01/03/2017)
Nonlocal image representation has been successfully used in many image-r...

Deep Universal Blind Image Denoising (01/18/2021)
Image denoising is an essential part of many image processing and comput...

Large Scale Variational Inference and Experimental Design for Sparse Generalized Linear Models (10/06/2008)
Many problems of low-level computer vision and image processing, such as...

Bayesian imaging using Plug & Play priors: when Langevin meets Tweedie (03/08/2021)
Since the seminal work of Venkatakrishnan et al. (2013), Plug & Play (...

Managing sparsity, time, and quality of inference in topic models (10/26/2012)
Inference is an integral part of probabilistic topic models, but is ofte...
