On Feature Interactions Identified by Shapley Values of Binary Classification Games

01/12/2020
by Sandhya Tripathi, et al.

For feature selection and related problems, we introduce the notion of a classification game, a cooperative game with features as players and a hinge-loss-based characteristic function, and relate each feature's contribution to a Shapley value based error apportioning (SVEA) of the total training error. Our major contributions are: (i) to show that, for any dataset, a threshold of 0 on the SVEA value identifies a feature subset whose joint interactions for label prediction are significant, or those features that span a subspace where the data predominantly lies; (ii) to identify the features on which the Bayes classifier does not depend but on which any surrogate-loss-based finite-sample classifier does, since these features contribute to the excess 0-1 risk of such a classifier; (iii) to estimate the unknown true hinge risk of a feature; and (iv) to relate the stability property of an allocation to negative-valued SVEA by designing the analogue of the core of the classification game. Because the Shapley value is computationally expensive, we build on a known Monte Carlo based approximation algorithm that computes the characteristic function (linear programs) only when needed. We address the potential sample-bias problem in feature selection by providing interval estimates for SVEA values obtained from multiple sub-samples. We illustrate all of the above aspects on various synthetic and real datasets, and show that our scheme achieves better results than the existing recursive feature elimination technique and ReliefF in most cases. Our theoretically grounded classification game, in terms of a well-defined characteristic function, offers interpretability and explainability of our framework, including the identification of important features.
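The Monte Carlo scheme mentioned in the abstract estimates Shapley values by sampling feature permutations and evaluating the characteristic function only for coalitions that actually arise along a sampled permutation. Below is a minimal sketch of that idea. The toy dataset, the fixed linear scorer standing in for the paper's LP-trained hinge-loss classifiers, and all names here are illustrative assumptions, not the paper's actual construction:

```python
import random

# Hypothetical toy data: rows are samples, labels in {-1, +1}.
# Feature 2 is deliberately irrelevant (its weight below is 0).
X = [
    [ 1.0,  0.8,  0.1],
    [ 0.9,  1.1, -0.2],
    [-1.0, -0.7,  0.3],
    [-1.1, -0.9, -0.1],
]
y = [1, 1, -1, -1]
w = [1.0, 1.0, 0.0]  # fixed scorer; the paper instead solves an LP per coalition

def hinge_risk(feature_subset):
    """Average hinge loss when only features in `feature_subset` contribute
    to the score; all other features are zeroed out."""
    total = 0.0
    for xi, yi in zip(X, y):
        score = sum(w[j] * xi[j] for j in feature_subset)
        total += max(0.0, 1.0 - yi * score)
    return total / len(X)

def mc_shapley(n_features, n_samples=2000, seed=0):
    """Permutation-sampling Monte Carlo estimate of each feature's Shapley
    value; the characteristic function v(S) is evaluated and cached only
    for coalitions that are actually needed."""
    rng = random.Random(seed)
    cache = {}
    def v(S):
        key = frozenset(S)
        if key not in cache:
            cache[key] = hinge_risk(key)
        return cache[key]
    phi = [0.0] * n_features
    players = list(range(n_features))
    for _ in range(n_samples):
        rng.shuffle(players)
        coalition = []
        for j in players:
            before = v(coalition)
            coalition.append(j)
            phi[j] += v(coalition) - before  # marginal contribution of j
    return [p / n_samples for p in phi]

svea = mc_shapley(3)
```

In this cost game, informative features receive negative values (adding them lowers the hinge risk), the dummy feature gets exactly 0, and the estimates sum to v(N) - v(empty) by construction, since each permutation's marginals telescope. The paper's SVEA apportions the total training error, so its sign and normalization conventions may differ from this sketch.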

