Minimum Stein Discrepancy Estimators

06/19/2019
by Alessandro Barp, et al.

When maximum likelihood estimation is infeasible, one often turns to score matching, contrastive divergence, or minimum probability flow learning to obtain tractable parameter estimates. We provide a unifying perspective on these techniques as minimum Stein discrepancy estimators and use this lens to design new diffusion kernel Stein discrepancy (DKSD) and diffusion score matching (DSM) estimators with complementary strengths. We establish the consistency, asymptotic normality, and robustness of DKSD and DSM estimators, derive stochastic Riemannian gradient descent algorithms for their efficient optimization, and demonstrate their advantages over score matching in models with non-smooth densities or heavy-tailed distributions.
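To make the idea concrete, here is a minimal sketch of the quantity such estimators minimize: a V-statistic estimate of the squared kernel Stein discrepancy between a sample and a model known only through its score function, using an RBF kernel. This is the vanilla KSD, not the paper's DKSD (which additionally inserts a diffusion matrix into the Stein operator); the bandwidth choice and the standard-normal example model are illustrative assumptions.

```python
import numpy as np

def ksd_vstat(samples, score, h=1.0):
    """V-statistic estimate of the squared kernel Stein discrepancy
    between the empirical distribution of `samples` and a model with
    score function `score` (the gradient of its log density), using an
    RBF kernel with bandwidth h. Vanilla KSD sketch, not DKSD."""
    X = np.asarray(samples, dtype=float)
    n, d = X.shape
    S = score(X)                            # (n, d) model scores at samples
    diff = X[:, None, :] - X[None, :, :]    # (n, n, d) pairwise x - y
    sq = np.sum(diff ** 2, axis=-1)         # (n, n) squared distances
    K = np.exp(-sq / (2 * h ** 2))          # RBF kernel matrix
    # Stein kernel u_p(x, y), term by term:
    t1 = (S @ S.T) * K                                    # s(x)·s(y) k(x,y)
    t2 = np.einsum('id,ijd->ij', S, diff) / h ** 2 * K    # s(x)·grad_y k
    t3 = -np.einsum('jd,ijd->ij', S, diff) / h ** 2 * K   # s(y)·grad_x k
    t4 = (d / h ** 2 - sq / h ** 4) * K                   # tr grad_x grad_y k
    return np.mean(t1 + t2 + t3 + t4)

# Example model: a standard normal, whose score is s_p(x) = -x.
rng = np.random.default_rng(0)
centered = rng.normal(0.0, 1.0, size=(200, 1))   # well-specified sample
shifted = centered + 3.0                          # misspecified sample
score = lambda x: -x
print(ksd_vstat(centered, score) < ksd_vstat(shifted, score))  # → True
```

A minimum Stein discrepancy estimator would minimize this quantity over model parameters; the shifted sample yields a much larger discrepancy against the standard-normal score, which is what drives the fit.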


Related research

07/21/2021 · Interpreting diffusion score matching using normalizing flow
Scoring matching (SM), and its related counterpart, Stein discrepancy (S...

08/16/2022 · Riemannian Diffusion Models
Diffusion models are recent state-of-the-art methods for image generatio...

05/15/2019 · Information criteria for non-normalized models
Many statistical models are given in the form of non-normalized densitie...

01/23/2019 · Unified efficient estimation framework for unnormalized models
Parameter estimation of unnormalized models is a challenging problem bec...

05/20/2020 · Nonparametric Score Estimators
Estimating the score, i.e., the gradient of log density function, from a...

10/28/2022 · Minimum Kernel Discrepancy Estimators
For two decades, reproducing kernels and their associated discrepancies ...

06/13/2019 · Statistical Inference for Generative Models with Maximum Mean Discrepancy
While likelihood-based inference and its variants provide a statisticall...
