The EM algorithm and the Laplace Approximation

01/24/2014
by Niko Brümmer et al.

The Laplace approximation calls for the computation of second derivatives at the likelihood maximum. When the maximum is found by the EM algorithm, there is a convenient way to compute these derivatives. The likelihood gradient can be obtained from the EM auxiliary function, while the Hessian can be obtained from this gradient with the Pearlmutter trick.
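The two ingredients can be illustrated on a toy model. By Fisher's identity, the gradient of the log-likelihood equals the posterior-weighted complete-data score, i.e. the gradient of the EM auxiliary evaluated at the current parameters; Hessian-vector products then follow as directional derivatives of that gradient. The sketch below is an assumption-laden illustration, not the paper's own code: it uses an equal-weight, unit-variance two-component Gaussian mixture with unknown means, and approximates the Pearlmutter R-operator by a central finite difference of the gradient.

```python
import numpy as np

def loglik(mu, x):
    # Log-likelihood of an equal-weight, unit-variance 2-component Gaussian
    # mixture (the additive normalization constant is dropped; it does not
    # affect derivatives).
    d1 = np.exp(-0.5 * (x - mu[0]) ** 2)
    d2 = np.exp(-0.5 * (x - mu[1]) ** 2)
    return np.sum(np.log(0.5 * d1 + 0.5 * d2))

def em_gradient(mu, x):
    # Fisher's identity: the score of the (incomplete-data) log-likelihood
    # equals the gradient of the EM auxiliary Q(mu, mu') at mu = mu',
    # i.e. the responsibility-weighted complete-data score.
    d1 = np.exp(-0.5 * (x - mu[0]) ** 2)
    d2 = np.exp(-0.5 * (x - mu[1]) ** 2)
    r = d1 / (d1 + d2)  # E-step responsibilities for component 1
    return np.array([np.sum(r * (x - mu[0])),
                     np.sum((1 - r) * (x - mu[1]))])

def hessian_vector(mu, x, v, eps=1e-5):
    # Pearlmutter trick (finite-difference stand-in for the exact
    # R-operator): H v is the directional derivative of the gradient
    # along v.
    return (em_gradient(mu + eps * v, x)
            - em_gradient(mu - eps * v, x)) / (2 * eps)

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-2, 1, 100), rng.normal(2, 1, 100)])
mu = np.array([-2.0, 2.0])   # near the likelihood maximum
v = np.array([1.0, -0.5])    # arbitrary direction
Hv = hessian_vector(mu, x, v)
```

Applying `hessian_vector` to the standard basis vectors recovers the full Hessian needed by the Laplace approximation; in high dimensions one would instead use only a few Hessian-vector products (e.g. inside a Lanczos or conjugate-gradient routine).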


