Performance-Agnostic Fusion of Probabilistic Classifier Outputs

09/01/2020
by   Jordan F. Masakuna, et al.

We propose a method for combining the probabilistic outputs of classifiers into a single consensus class prediction when no further information about the individual classifiers is available beyond the fact that they have been trained for the same task. The lack of relevant prior information rules out typical applications of Bayesian or Dempster-Shafer methods, and the default approach here would be a method based on the principle of indifference, such as the sum or product rule, which weights all classifiers equally. In contrast, our approach considers the diversity between the outputs of the various classifiers, iteratively updating each prediction based on its correspondence with the other predictions until the predictions converge to a consensus decision. The intuition behind this approach is that classifiers trained for the same task should typically exhibit regularities in their outputs on new data; the predictions of classifiers that differ significantly from those of the others are therefore given less credence. The approach implicitly assumes a symmetric loss function, in that the relative costs of different prediction errors are not taken into account. Performance is demonstrated on several benchmark datasets. The proposed method works well when accuracy is the performance metric; however, it does not output calibrated probabilities, so it is unsuitable in situations where such probabilities are required for further processing.
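The general shape of such an agreement-driven fusion scheme can be sketched as follows. This is a minimal illustrative sketch, not the authors' exact algorithm: the similarity measure (an exponential of the Euclidean distance to the current consensus) and the convergence criterion are assumptions made for the example.

```python
import numpy as np

def fuse_predictions(probs, tol=1e-6, max_iter=100):
    """Iteratively fuse classifier probability outputs into a consensus.

    probs: array of shape (n_classifiers, n_classes), each row a
    probability distribution over the classes. Classifiers whose
    outputs diverge from the emerging consensus are progressively
    down-weighted. (Hypothetical update rule; the paper's exact
    scheme may differ.)
    """
    probs = np.asarray(probs, dtype=float)
    n = probs.shape[0]
    weights = np.full(n, 1.0 / n)          # start from indifference
    for _ in range(max_iter):
        consensus = weights @ probs        # weighted average prediction
        # similarity of each classifier's output to the current consensus
        sim = np.exp(-np.linalg.norm(probs - consensus, axis=1))
        new_weights = sim / sim.sum()
        if np.max(np.abs(new_weights - weights)) < tol:
            weights = new_weights
            break
        weights = new_weights
    return weights @ probs

# Example: two roughly agreeing classifiers and one outlier.
# The outlier's vote is down-weighted, so the consensus follows the majority.
fused = fuse_predictions([[0.8, 0.2], [0.7, 0.3], [0.1, 0.9]])
print(fused.argmax())  # consensus class 0
```

Note that the fixed point depends only on the classifier outputs themselves, with no performance information about any individual classifier, which is the performance-agnostic property the abstract describes.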

Related research

- 02/11/2021, Sample Efficient Learning of Image-Based Diagnostic Classifiers Using Probabilistic Labels: "Deep learning approaches often require huge datasets to achieve good gen..."
- 09/15/2023, Performance Metrics for Probabilistic Ordinal Classifiers: "Ordinal classification models assign higher penalties to predictions fur..."
- 12/04/2020, A novel multi-classifier information fusion based on Dempster-Shafer theory: application to vibration-based fault detection: "Achieving a high prediction rate is a crucial task in fault detection. A..."
- 02/08/2022, Calibrated Learning to Defer with One-vs-All Classifiers: "The learning to defer (L2D) framework has the potential to make AI syste..."
- 05/19/2017, Estimating Accuracy from Unlabeled Data: A Probabilistic Logic Approach: "We propose an efficient method to estimate the accuracy of classifiers u..."
- 03/08/2018, Aggregation using input-output trade-off: "In this paper, we introduce a new learning strategy based on a seminal i..."
- 06/03/2021, A Normative Model of Classifier Fusion: "Combining the outputs of multiple classifiers or experts into a single p..."
