Evasion and Hardening of Tree Ensemble Classifiers

09/25/2015
by Alex Kantchelian, et al.

Classifier evasion consists of finding, for a given instance x, the nearest instance x' such that the classifier predictions for x and x' differ. We present two novel algorithms for systematically computing evasions for tree ensembles such as boosted trees and random forests. Our first algorithm uses a Mixed Integer Linear Program solver and finds the optimal evading instance under an expressive set of constraints. Our second algorithm trades off optimality for speed by using symbolic prediction, a novel algorithm for fast finite differences on tree ensembles. On a digit recognition task, we demonstrate that both gradient boosted trees and random forests are extremely susceptible to evasions. Finally, we harden a boosted tree model without loss of predictive accuracy by augmenting the training set of each boosting round with evading instances, a technique we call adversarial boosting.
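To make the evasion problem concrete, here is a minimal sketch in pure Python. It is not the paper's MILP or symbolic-prediction algorithm; it is an assumed greedy coordinate search against a tiny hand-built ensemble of decision stumps, with all thresholds, scores, and step sizes chosen for illustration only.

```python
# Toy "boosted" ensemble: each stump splits on one feature and returns a
# score; the ensemble's prediction is the sign of the summed scores.
def stump(feature, threshold, left, right):
    """Return a one-split decision tree as a scoring function."""
    return lambda x: left if x[feature] <= threshold else right

ensemble = [
    stump(0, 0.5, -1.0, +1.0),   # illustrative splits, not learned
    stump(1, 0.3, -0.5, +0.5),
    stump(0, 0.8, -0.2, +0.2),
]

def predict(x):
    return 1 if sum(t(x) for t in ensemble) > 0 else -1

def evade(x, step=0.05, budget=100):
    """Greedy search for a nearby evading instance: repeatedly take the
    single-coordinate move that most helps the opposite class, until the
    ensemble's prediction flips or the budget runs out."""
    target = -predict(x)
    x = list(x)
    for _ in range(budget):
        if predict(x) == target:
            return x
        best = None
        for i in range(len(x)):
            for d in (-step, +step):
                cand = list(x)
                cand[i] += d
                # higher value == candidate is closer to the target class
                score = target * sum(t(cand) for t in ensemble)
                if best is None or score > best[0]:
                    best = (score, cand)
        x = best[1]
    return x if predict(x) == target else None

x = [0.9, 0.9]       # classified +1 by this toy ensemble
x_adv = evade(x)     # a nearby instance classified -1
```

Because tree ensembles are piecewise constant, most small moves change nothing until a split threshold is crossed, which is why the paper's exact MILP formulation (searching over leaf combinations) and symbolic prediction (fast finite differences) are far more effective than naive gradient-style search on real models.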
