Expert load matters: operating networks at high accuracy and low manual effort

08/09/2023
by   Sara Sangalli, et al.

In human-AI collaboration systems for critical applications, to ensure minimal error, users should set an operating point based on model confidence to determine when a decision should be delegated to human experts. Samples for which the model's confidence falls below the operating point are then analysed manually by experts to avoid mistakes. Such systems become truly useful only if they address two aspects: the model should be confident only on samples for which it is accurate, and the number of samples delegated to experts should be minimized. The latter is especially crucial for applications where expert time is limited and expensive, such as healthcare. The trade-off between model accuracy and the number of samples delegated to experts can be represented by a curve similar to an ROC curve, which we refer to as the confidence operating characteristic (COC) curve. In this paper, we argue that deep neural networks should be trained taking both accuracy and expert load into account and, to that end, propose a new complementary loss function for classification that maximizes the area under the COC curve. This simultaneously promotes higher network accuracy and a reduction in the number of samples delegated to humans. We perform classification experiments on multiple computer vision and medical imaging datasets. Our results demonstrate that, compared to existing loss functions, the proposed loss improves classification accuracy, delegates fewer decisions to experts, achieves better out-of-distribution sample detection, and attains on-par calibration performance.
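To make the COC curve concrete, the sketch below (our own illustration, not the authors' code; the function names `coc_curve` and `auc_coc` are hypothetical) sweeps an operating threshold over model confidences. For each threshold, samples below it are delegated to experts, and we record the delegated fraction together with the accuracy on the remaining, automatically decided samples; the area under the resulting curve is then a natural quantity to maximize.

```python
import numpy as np

def coc_curve(confidences, correct):
    """Trace a confidence operating characteristic (COC) curve.

    At each operating point (confidence threshold), samples with
    confidence below the threshold are delegated to experts; the rest
    are decided by the model. Returns the fraction of delegated
    samples and the accuracy on the automated samples per threshold.
    """
    confidences = np.asarray(confidences, dtype=float)
    correct = np.asarray(correct, dtype=bool)
    # Sweep thresholds at 0 and at every observed confidence value.
    thresholds = np.concatenate(([0.0], np.sort(confidences)))
    delegated_frac, auto_acc = [], []
    for t in thresholds:
        auto = confidences >= t              # handled by the model
        delegated_frac.append(1.0 - auto.mean())
        # Accuracy on automated samples (1.0 if everything is delegated).
        auto_acc.append(correct[auto].mean() if auto.any() else 1.0)
    return np.array(delegated_frac), np.array(auto_acc)

def auc_coc(delegated_frac, auto_acc):
    """Area under the COC curve via the trapezoidal rule."""
    order = np.argsort(delegated_frac)
    d, a = delegated_frac[order], auto_acc[order]
    return float(np.sum(0.5 * (a[1:] + a[:-1]) * np.diff(d)))
```

A model that is confident exactly on the samples it gets right yields a curve that reaches high automated accuracy while delegating few samples, hence a large area; the paper's loss trains the network toward this regime rather than measuring it post hoc.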


