Monitoring Model Deterioration with Explainable Uncertainty Estimation via Non-parametric Bootstrap

01/27/2022
by Carlos Mougan, et al.

Monitoring machine learning models once they are deployed is challenging. It is even more challenging to decide when to retrain models in real-world scenarios where labeled data is beyond reach and monitoring performance metrics becomes unfeasible. In this work, we use non-parametric bootstrapped uncertainty estimates and SHAP values to provide explainable uncertainty estimation as a technique that aims to monitor the deterioration of machine learning models in deployment environments, as well as to determine the source of model deterioration when target labels are not available. Classical methods are purely aimed at detecting distribution shift, which can lead to false positives in the sense that the model has not deteriorated despite a shift in the data distribution. To estimate model uncertainty we construct prediction intervals using a novel bootstrap method, which improves upon the work of Kumar and Srivastava (2012). We show that both our model deterioration detection system and our uncertainty estimation method achieve better performance than the current state of the art. Finally, we use explainable AI techniques to gain an understanding of the drivers of model deterioration. We release an open-source Python package, doubt, which implements our proposed methods, as well as the code used to reproduce our experiments.
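To illustrate the general idea behind non-parametric bootstrap prediction intervals (not the paper's exact algorithm or the `doubt` API), the sketch below refits a model on resampled training data and perturbs each refit prediction with a resampled residual; the interval endpoints are empirical quantiles of the resulting predictions. The `fit` and `predict` callables are hypothetical placeholders introduced for this example.

```python
import random
import statistics


def bootstrap_prediction_interval(train_x, train_y, fit, predict, x_new,
                                  n_boot=200, alpha=0.05, seed=0):
    """Generic non-parametric bootstrap prediction interval (sketch).

    `fit(xs, ys)` returns a fitted model; `predict(model, x)` returns a
    float. These are illustrative placeholders, not the paper's method.
    """
    rng = random.Random(seed)
    n = len(train_x)

    # Residuals of a model fitted on the full training set; resampling
    # them adds back the irreducible noise on top of model uncertainty.
    base = fit(train_x, train_y)
    residuals = [y - predict(base, x) for x, y in zip(train_x, train_y)]

    preds = []
    for _ in range(n_boot):
        # Resample the training set with replacement and refit the model.
        idx = [rng.randrange(n) for _ in range(n)]
        model = fit([train_x[i] for i in idx], [train_y[i] for i in idx])
        # Perturb the refit prediction with a resampled residual.
        preds.append(predict(model, x_new) + rng.choice(residuals))

    preds.sort()
    lo = preds[int((alpha / 2) * n_boot)]
    hi = preds[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi
```

Wide intervals on incoming unlabeled data can then serve as a label-free deterioration signal: if intervals grow markedly relative to a reference window, the model is likely operating outside the regime it was trained on.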


