JUMP-Means: Small-Variance Asymptotics for Markov Jump Processes

by Jonathan H. Huggins, et al.

Markov jump processes (MJPs) are used to model a wide range of phenomena, from disease progression to RNA folding pathways. However, maximum likelihood estimation of parametric models leads to degenerate trajectories, and inferential performance is poor in nonparametric models. We take a small-variance asymptotics (SVA) approach to overcome these limitations. We derive the small-variance asymptotics for parametric and nonparametric MJPs for both directly observed and hidden state models. In the parametric case we obtain a novel objective function which leads to non-degenerate trajectories. To derive the nonparametric version we introduce the gamma-gamma process, a novel extension of the gamma-exponential process. We propose algorithms for each of these formulations, which we call JUMP-means. Our experiments demonstrate that JUMP-means is competitive with or outperforms widely used MJP inference approaches in terms of both speed and reconstruction accuracy.
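As background for the abstract above, a Markov jump process holds in a state for an exponentially distributed time and then jumps to another state with probability proportional to its rate. Below is a minimal, hedged sketch of forward simulation from an MJP (Gillespie-style sampling) using a hypothetical 3-state rate matrix; this illustrates the model class only, not the paper's JUMP-means inference algorithm:

```python
import random

def simulate_mjp(Q, x0, t_max, seed=0):
    """Simulate one trajectory of a Markov jump process with rate matrix Q.

    Q[i][j] is the jump rate from state i to state j (i != j). The process
    holds in state i for an Exponential(sum_j Q[i][j]) time, then jumps to
    state j with probability proportional to Q[i][j].
    Returns a list of (time, state) pairs starting at (0.0, x0).
    """
    rng = random.Random(seed)
    t, x = 0.0, x0
    path = [(t, x)]
    while True:
        # rates out of the current state (diagonal excluded)
        rates = [Q[x][j] if j != x else 0.0 for j in range(len(Q))]
        total = sum(rates)
        if total == 0.0:          # absorbing state: no further jumps
            break
        t += rng.expovariate(total)
        if t >= t_max:            # stop at the time horizon
            break
        # choose the next state with probability proportional to its rate
        u = rng.random() * total
        acc = 0.0
        for j, r in enumerate(rates):
            acc += r
            if u <= acc:
                x = j
                break
        path.append((t, x))
    return path

# Toy 3-state model with made-up rates (e.g. disease stages 0 -> 1 -> 2);
# state 2 is absorbing.
Q = [[0.0, 1.0, 0.0],
     [0.5, 0.0, 1.5],
     [0.0, 0.0, 0.0]]
traj = simulate_mjp(Q, x0=0, t_max=10.0)
```

The trajectory is piecewise constant, which is why degenerate maximum-likelihood solutions (trajectories that collapse jump times) are a concern for this model class.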


