Data Synthesis for Testing Black-Box Machine Learning Models

11/03/2021
by Diptikalyan Saha, et al.

The increasing use of machine learning models raises questions about their reliability. The current practice of testing with limited data is often insufficient. In this paper, we provide a framework for automated test data synthesis to test black-box ML/DL models. We address the key challenge of generating realistic, user-controllable data with model-agnostic coverage criteria to test a varied set of properties, with the goal of increasing trust in machine learning models. We experimentally demonstrate the effectiveness of our technique.
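To make the black-box testing idea concrete, below is a minimal, hypothetical sketch (not the paper's actual method): synthesize test inputs from user-specified feature ranges, query an opaque model only through its prediction interface, and check a property such as monotonicity. The model, feature names, and ranges here are all illustrative assumptions.

```python
# Hypothetical sketch of black-box property testing via data synthesis.
# The model, features, and ranges below are illustrative assumptions,
# not the paper's actual framework.
import random

def black_box_model(features):
    # Stand-in for an opaque ML model: a simple linear scoring rule.
    return 2.0 * features["income"] - 0.5 * features["debt"]

def synthesize_inputs(ranges, n, seed=0):
    # Draw synthetic inputs uniformly from user-specified feature ranges.
    rng = random.Random(seed)
    return [{k: rng.uniform(lo, hi) for k, (lo, hi) in ranges.items()}
            for _ in range(n)]

def check_monotonicity(model, inputs, feature, delta=1.0):
    # Property check: increasing `feature` should never lower the score.
    # Only the model's input/output interface is used (black-box access).
    violations = []
    for x in inputs:
        y = dict(x)
        y[feature] = x[feature] + delta
        if model(y) < model(x):
            violations.append(x)
    return violations

ranges = {"income": (20_000, 150_000), "debt": (0, 50_000)}
tests = synthesize_inputs(ranges, n=100)
print(len(check_monotonicity(black_box_model, tests, "income")))
```

The same loop generalizes to other model-agnostic properties (e.g., fairness or robustness checks), since the model is exercised purely through queries.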


Related research

- Testing Monotonicity of Machine Learning Models (02/27/2020)
- MAGIX: Model Agnostic Globally Interpretable Explanations (06/22/2017)
- Fairness in the Eyes of the Data: Certifying Machine-Learning Models (09/03/2020)
- Testing Framework for Black-box AI Models (02/11/2021)
- A glass-box interactive machine learning approach for solving NP-hard problems with the human-in-the-loop (08/03/2017)
- ALT-MAS: A Data-Efficient Framework for Active Testing of Machine Learning Algorithms (04/11/2021)
