Quantifying contribution and propagation of error from computational steps, algorithms and hyperparameter choices in image classification pipelines

02/21/2019
by Aritra Chowdhury, et al.

Data science relies on pipelines organized as interdependent computational steps. Each step offers several candidate algorithms that may be used to perform its function, and each algorithm exposes several hyperparameters. Algorithms and hyperparameters must be optimized jointly to produce the best performance. Typical machine learning pipelines combine complex algorithms at each step, so not only is the selection process combinatorial, but it is also important to interpret and understand the resulting pipelines. We propose a method to quantify the importance of the different components of a pipeline by computing an error contribution relative to an agnostic choice of computational steps, algorithms and hyperparameters. We also propose a methodology to quantify the propagation of error from individual components of the pipeline with the help of a naive set of benchmark algorithms not involved in the pipeline. We demonstrate our methodology on image classification pipelines. The agnostic and naive methodologies quantify the error contribution and propagation, respectively, from the computational steps, algorithms and hyperparameters of the image classification pipeline. We show that algorithm selection and hyperparameter optimization methods such as grid search, random search and Bayesian optimization can be used to quantify error contribution and propagation, and that random search quantifies them more accurately than Bayesian optimization. Domain experts can use this methodology to understand machine learning and data analysis pipelines in terms of their individual components, which helps in prioritizing different components of the pipeline.
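To make the agnostic idea concrete, the following is a minimal Python sketch of one plausible reading of the error-contribution computation under random search: sample configurations uniformly, then compare the average of the best errors achievable under each fixed choice at a given step against the overall best error. The search space, the algorithm names, and the evaluate function are all illustrative assumptions, not the paper's actual pipeline or formal definitions.

import random
import statistics

# Hypothetical search space for an image classification pipeline:
# each computational step maps to candidate algorithms, and each
# algorithm has a small hyperparameter grid. All names here are
# illustrative placeholders.
SEARCH_SPACE = {
    "preprocessing": {"none": {}, "denoise": {"strength": [0.1, 0.5]}},
    "features": {"hog": {"cells": [4, 8]}, "sift": {"octaves": [3, 4]}},
    "classifier": {"svm": {"C": [0.1, 1.0, 10.0]}, "rf": {"trees": [50, 200]}},
}

def sample_config(space):
    """Draw one pipeline configuration uniformly at random (random search)."""
    config = {}
    for step, algos in space.items():
        algo = random.choice(list(algos))
        params = {name: random.choice(values) for name, values in algos[algo].items()}
        config[step] = (algo, params)
    return config

def error_contribution(evaluate, space, step, n_samples=500):
    """Estimate the error contributed by being agnostic about `step`:
    average the best error reachable under each fixed choice at `step`,
    then subtract the overall best error found. `evaluate` is a
    user-supplied function mapping a configuration to a test error."""
    best_overall = float("inf")
    best_per_choice = {}
    for _ in range(n_samples):
        config = sample_config(space)
        err = evaluate(config)
        choice = config[step][0]  # algorithm chosen at this step
        best_per_choice[choice] = min(err, best_per_choice.get(choice, float("inf")))
        best_overall = min(best_overall, err)
    agnostic = statistics.mean(best_per_choice.values())
    return agnostic - best_overall

# Toy evaluation in which the classifier choice dominates the error,
# so that step should report the largest contribution.
def toy_evaluate(config):
    base = 0.30 if config["classifier"][0] == "svm" else 0.10
    return base + random.uniform(0.0, 0.05)

if __name__ == "__main__":
    for step in SEARCH_SPACE:
        print(step, round(error_contribution(toy_evaluate, SEARCH_SPACE, step), 3))

In this toy setup the classifier step reports the largest contribution, since the simulated error depends mostly on that choice. The paper's complementary naive methodology, which probes error propagation by inserting benchmark algorithms not involved in the pipeline, is not implemented in this sketch.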
