Maximally Divergent Intervals for Anomaly Detection

10/21/2016
by   Erik Rodner, et al.

We present new methods for batch anomaly detection in multivariate time series. Our methods are based on maximizing the Kullback-Leibler divergence between the data distribution within an interval of the time series and the distribution outside it. An empirical analysis demonstrates the benefits of our algorithms over methods that treat each time step independently of the others and do not optimize over all possible intervals.
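To illustrate the core idea (this is a simplified sketch, not the authors' implementation), the snippet below fits a univariate Gaussian to the points inside and outside each candidate interval and scores intervals by the KL divergence between the two fits. The Gaussian model, the brute-force scan over intervals, and the variance smoothing constant are all simplifying assumptions made here for clarity.

```python
import numpy as np

def gaussian_kl(mu_p, var_p, mu_q, var_q):
    # Closed-form KL(N(mu_p, var_p) || N(mu_q, var_q)) for univariate Gaussians.
    return 0.5 * (np.log(var_q / var_p) + (var_p + (mu_p - mu_q) ** 2) / var_q - 1.0)

def max_divergent_interval(x, min_len=5, max_len=50):
    """Scan all intervals [a, b) and return the one whose Gaussian fit
    diverges most (in KL) from a Gaussian fit to the rest of the series."""
    n = len(x)
    best_score, best_interval = -np.inf, None
    for a in range(n):
        for b in range(a + min_len, min(a + max_len, n) + 1):
            inside = x[a:b]
            outside = np.concatenate([x[:a], x[b:]])
            if len(outside) < 2:
                continue
            # Small constant added to the variances for numerical stability.
            score = gaussian_kl(inside.mean(), inside.var() + 1e-9,
                                outside.mean(), outside.var() + 1e-9)
            if score > best_score:
                best_score, best_interval = score, (a, b)
    return best_interval, best_score

# Usage: a series with a mean shift on [60, 80) should be flagged there.
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, 200)
x[60:80] += 5.0
(a, b), score = max_divergent_interval(x, min_len=5, max_len=40)
```

In contrast to point-wise detectors, the score here is a property of the whole interval, which is what lets the method localize anomalous segments rather than isolated samples.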
