Asymptotic Optimality of Mixture Rules for Detecting Changes in General Stochastic Models

07/24/2018
by   Alexander G. Tartakovsky, et al.

The paper addresses a sequential changepoint detection problem for a general stochastic model, assuming that the observed data may be non-i.i.d. (i.e., dependent and non-identically distributed) and that the prior distribution of the change point is arbitrary. Tartakovsky and Veeravalli (2005), Baron and Tartakovsky (2006), and, more recently, Tartakovsky (2017) developed a general asymptotic theory of changepoint detection for non-i.i.d. stochastic models, assuming certain stability of the log-likelihood ratio process, in the case of simple hypotheses where both pre-change and post-change models are completely specified. In most applications, however, the post-change distribution is not completely known. In the present paper, we generalize previous results to the case of parametric uncertainty, assuming the parameter of the post-change distribution is unknown. We introduce two detection rules based on mixtures -- the Mixture Shiryaev rule and the Mixture Shiryaev--Roberts rule -- and study their asymptotic properties in the Bayesian context. In particular, we provide sufficient conditions under which these rules are first-order asymptotically optimal, minimizing moments of the delay to detection as the probability of false alarm approaches zero.
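To make the mixture idea concrete, here is a minimal sketch of a Mixture Shiryaev--Roberts detector for the textbook i.i.d. Gaussian mean-shift setting (pre-change N(0,1), post-change N(theta,1) with theta unknown). This is an illustrative special case, not the paper's general non-i.i.d. construction: the discrete parameter grid, the uniform mixing weights, and the threshold value below are all assumptions chosen for the example. For each candidate theta, the Shiryaev--Roberts statistic obeys the recursion R_n(theta) = (1 + R_{n-1}(theta)) * LR_n(theta); the rule stops when the mixture (weighted average over theta) exceeds a threshold.

```python
import numpy as np

def mixture_sr_rule(x, theta_grid, weights, threshold):
    """Mixture Shiryaev-Roberts stopping rule for a Gaussian mean shift.

    Pre-change observations ~ N(0,1); post-change ~ N(theta,1), theta unknown
    and mixed over `theta_grid` with prior `weights`. Returns the stopping
    time (1-based index) at which the mixture statistic first exceeds
    `threshold`, or None if it never does.
    """
    R = np.zeros(len(theta_grid))  # per-theta SR statistics, R_0(theta) = 0
    for n, xn in enumerate(x, start=1):
        # Likelihood ratio N(theta,1)/N(0,1) evaluated at observation xn
        lr = np.exp(theta_grid * xn - theta_grid**2 / 2.0)
        # Per-theta Shiryaev-Roberts recursion, then mix over the prior on theta
        R = (1.0 + R) * lr
        if np.dot(weights, R) > threshold:
            return n
    return None

# Hypothetical usage: change from N(0,1) to N(1,1) at time nu = 100
rng = np.random.default_rng(0)
nu = 100
x = np.concatenate([rng.standard_normal(nu),
                    rng.standard_normal(200) + 1.0])
theta_grid = np.linspace(0.5, 1.5, 5)          # candidate post-change means
weights = np.full(len(theta_grid), 1.0 / 5.0)  # uniform mixing prior
t = mixture_sr_rule(x, theta_grid, weights, threshold=1e5)
```

Under the no-change measure each R_n(theta) is a nonnegative martingale with mean n, so the probability of a false alarm before time nu is at most roughly nu/threshold, which is the usual way the threshold is calibrated to a false-alarm constraint.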
