Beyond Sparsity: Tree Regularization of Deep Models for Interpretability

11/16/2017
by Mike Wu, et al.

The lack of interpretability remains a key barrier to the adoption of deep models in many applications. In this work, we explicitly regularize deep models so that human users might step through the process behind their predictions in little time. Specifically, we train deep time-series models so that their class-probability predictions have high accuracy while being closely modeled by decision trees with few nodes. Using intuitive toy examples as well as medical tasks for treating sepsis and HIV, we demonstrate that this new tree regularization yields models that are easier for humans to simulate than models trained with simpler L1 or L2 penalties, without sacrificing predictive power.
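The core quantity behind the abstract's "decision trees with few nodes" idea can be illustrated by distilling a trained model's predictions into a small tree and measuring its complexity. The sketch below is a minimal, hypothetical illustration using scikit-learn: the toy XOR-style data and the small MLP are stand-ins for the paper's deep time-series models, and it shows only the interpretability proxy (average decision-path length of a distilled tree), not the differentiable surrogate the paper uses to make that proxy trainable.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier

# Toy 2-class data (illustrative stand-in for the paper's clinical time series)
rng = np.random.RandomState(0)
X = rng.randn(500, 2)
y = (X[:, 0] * X[:, 1] > 0).astype(int)  # XOR-like labels

# Train a "deep" model; a small MLP keeps the sketch self-contained
mlp = MLPClassifier(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
mlp.fit(X, y)

# Distill the model's hard predictions into a shallow decision tree
tree = DecisionTreeClassifier(max_depth=4, random_state=0)
tree.fit(X, mlp.predict(X))

# Interpretability proxy: average decision-path length over the data
# (how many nodes a human must step through per prediction)
path_lengths = tree.decision_path(X).sum(axis=1)
apl = float(np.mean(path_lengths))
node_count = tree.tree_.node_count
print(f"tree nodes: {node_count}, average path length: {apl:.2f}")
```

In the paper this path-length proxy is turned into a regularization term during training of the deep model itself; here it is only computed after the fact, as a post-hoc measure of how simulable the learned decision boundary is.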


Related research

- Optimizing for Interpretability in Deep Neural Networks with Tree Regularization (08/14/2019): Deep models have advanced prediction in many domains, but their lack of ...
- Regional Tree Regularization for Interpretability in Black Box Models (08/13/2019): The lack of interpretability remains a barrier to the adoption of deep n...
- Improving Stability in Decision Tree Models (05/26/2023): Owing to their inherently interpretable structure, decision trees are co...
- Enhancing Decision Tree based Interpretation of Deep Neural Networks through L1-Orthogonal Regularization (04/10/2019): One obstacle that so far prevents the introduction of machine learning m...
- Tabular Data: Deep Learning is Not All You Need (06/06/2021): A key element of AutoML systems is setting the types of models that will...
- Self-explaining Hierarchical Model for Intraoperative Time Series (10/10/2022): Major postoperative complications are devastating to surgical patients. ...
- Thought Flow Nets: From Single Predictions to Trains of Model Thought (07/26/2021): When humans solve complex problems, they rarely come up with a decision ...
