A MAP approach for ℓ_q-norm regularized sparse parameter estimation using the EM algorithm

08/05/2015
by Rodrigo Carvajal, et al.

In this paper, Bayesian parameter estimation under the Maximum A Posteriori (MAP) criterion is revisited through the lens of the Expectation-Maximization (EM) algorithm. By incorporating a sparsity-promoting penalty term into the cost function of the estimation problem via an appropriate prior distribution, we show how the EM algorithm can be used to efficiently solve the corresponding optimization problem. To this end, we describe the prior distribution using variance-mean Gaussian mixtures (VMGM) and exploit several convenient properties of these mixtures in our estimation problem. The resulting MAP estimation problem is expressed entirely in terms of the EM algorithm, which accommodates nonlinearities and hidden variables that traditional methods cannot easily handle. For comparison, we also develop a coordinate descent algorithm for the ℓ_q-norm penalized problem and report performance results from simulations.
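To make the mixture-based EM recipe concrete, here is a minimal Python sketch for the special case of a linear-Gaussian model y = Φθ + e with a generalized-Gaussian (ℓ_q-type) prior p(θ_j) ∝ exp(-λ|θ_j|^q), 0 < q ≤ 2. In the Gaussian-scale-mixture view, the E-step reduces to computing the expected inverse mixing variances w_j = λq|θ_j|^{q-2}, and the M-step to a weighted ridge regression, i.e. the classic iteratively reweighted least squares (FOCUSS-style) update. The function name em_map_lq, the simulated data, and the settings lam, q, and sigma2 are illustrative assumptions, not the paper's construction, which is more general (VMGM priors, nonlinear models, additional hidden variables).

```python
import numpy as np

def em_map_lq(y, Phi, lam=1.0, q=0.5, sigma2=1.0, n_iter=50, eps=1e-8):
    """MAP estimate of theta in y = Phi @ theta + noise under an
    l_q-type penalty, via the EM / Gaussian-scale-mixture view.

    E-step: given theta^(k), the expected inverse mixing variance of
    the generalized-Gaussian prior is w_j = lam * q * |theta_j|^(q-2).
    M-step: solve the resulting weighted ridge regression in closed form.
    (Illustrative sketch; not the paper's full VMGM algorithm.)
    """
    theta = np.linalg.lstsq(Phi, y, rcond=None)[0]  # least-squares init
    for _ in range(n_iter):
        # E-step: prior weights (eps floors |theta|^(q-2), which is
        # singular at zero for q < 2)
        w = lam * q * np.maximum(np.abs(theta), eps) ** (q - 2)
        # M-step: weighted ridge solution of the quadratic surrogate
        A = Phi.T @ Phi / sigma2 + np.diag(w)
        theta = np.linalg.solve(A, Phi.T @ y / sigma2)
    return theta

# Usage on hypothetical data: sparse ground truth, noisy linear measurements
rng = np.random.default_rng(0)
Phi = rng.standard_normal((100, 30))
theta_true = np.zeros(30)
theta_true[[3, 7, 20]] = [2.0, -1.5, 1.0]
y = Phi @ theta_true + 0.1 * rng.standard_normal(100)
theta_hat = em_map_lq(y, Phi, lam=5.0, q=0.5, sigma2=0.01)
print(np.round(theta_hat, 2))
```

Note the eps floor: the weight |θ_j|^{q-2} blows up as θ_j → 0, which is precisely what drives small coefficients toward zero but requires a numerical safeguard in practice.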


Related research

09/12/2019 · Regularized Estimation and Feature Selection in Mixtures of Gaussian-Gated Experts Models
Mixtures-of-Experts models and their maximum likelihood estimation (MLE)...

03/16/2019 · Bayesian and Spline based Approaches for (EM based) Graphon Estimation
The paper proposes the estimation of a graphon function for network data...

10/29/2018 · Regularized Maximum Likelihood Estimation and Feature Selection in Mixtures-of-Experts Models
Mixture of Experts (MoE) are successful models for modeling heterogeneou...

01/22/2016 · Rectified Gaussian Scale Mixtures and the Sparse Non-Negative Least Squares Problem
In this paper, we develop a Bayesian evidence maximization framework to ...

03/13/2023 · Maximum a Posteriori Estimation in Graphical Models Using Local Linear Approximation
Sparse structure learning in high-dimensional Gaussian graphical models ...

01/20/2020 · A Monte Carlo EM Algorithm for the Parameter Estimation of Aggregated Hawkes Processes
A key difficulty that arises from real event data is imprecision in the ...

01/24/2022 · Decentralized EM to Learn Gaussian Mixtures from Datasets Distributed by Features
Expectation Maximization (EM) is the standard method to learn Gaussian m...
