Improving Interpretability in Medical Imaging Diagnosis using Adversarial Training

12/02/2020
by Andrei Margeloiu, et al.

We investigate the influence of adversarial training on the interpretability of convolutional neural networks (CNNs), specifically applied to diagnosing skin cancer. We show that gradient-based saliency maps of adversarially trained CNNs are significantly sharper and more visually coherent than those of standardly trained CNNs. Furthermore, we show that adversarially trained networks highlight regions with significant color variation within the lesion, a common characteristic of melanoma. We find that fine-tuning a robust network with a small learning rate further improves saliency maps' sharpness. Lastly, we provide preliminary work suggesting that robustifying the first layers to extract robust low-level features leads to visually coherent explanations.
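The following is a minimal sketch, not the authors' released code, of the two ingredients the abstract combines: a PGD adversarial-training step and a vanilla-gradient saliency map. It assumes PyTorch, a CNN classifier `model`, and a batch of dermoscopic images `images` in [0, 1] with labels `labels`; the attack hyperparameters (eps, alpha, steps) are illustrative defaults, not the paper's settings.

    # Sketch only: adversarial (PGD) training step and vanilla-gradient saliency.
    import torch
    import torch.nn.functional as F

    def pgd_attack(model, images, labels, eps=8/255, alpha=2/255, steps=7):
        """Projected gradient descent inside an L-infinity ball of radius eps."""
        adv = images.clone().detach() + torch.empty_like(images).uniform_(-eps, eps)
        adv = adv.clamp(0, 1)
        for _ in range(steps):
            adv.requires_grad_(True)
            loss = F.cross_entropy(model(adv), labels)
            grad = torch.autograd.grad(loss, adv)[0]
            adv = adv.detach() + alpha * grad.sign()          # ascend the loss
            adv = images + (adv - images).clamp(-eps, eps)    # project back into the ball
            adv = adv.clamp(0, 1)
        return adv.detach()

    def adversarial_training_step(model, optimizer, images, labels):
        """One robust-training step: optimize the loss on adversarial examples."""
        model.train()
        adv = pgd_attack(model, images, labels)
        optimizer.zero_grad()
        loss = F.cross_entropy(model(adv), labels)
        loss.backward()
        optimizer.step()
        return loss.item()

    def vanilla_gradient_saliency(model, image):
        """Gradient of the top class score w.r.t. the input pixels (CHW image)."""
        model.eval()
        image = image.clone().detach().requires_grad_(True)
        score = model(image.unsqueeze(0)).max(dim=1).values.sum()
        score.backward()
        # Max absolute gradient over color channels gives a single-channel map.
        return image.grad.abs().max(dim=0).values

The fine-tuning result mentioned above would correspond, in this sketch, to continuing adversarial_training_step from robust weights with a reduced optimizer learning rate.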


Related Research

10/16/2019  Global Saliency: Aggregating Saliency Maps to Assess Dataset Artefact Bias
06/23/2021  Gradient-Based Interpretability Methods and Binarized Neural Networks
06/25/2020  Investigating and Exploiting Image Resolution for Transfer Learning-based Skin Lesion Classification
05/09/2019  Learning Interpretable Features via Adversarially Robust Optimization
06/14/2020  On Saliency Maps and Adversarial Robustness

Code Repositories

Interpretability-Adversarial

Code for the NeurIPS 2020 Workshop paper "Improving Interpretability in Medical Imaging Diagnosis using Adversarial Training"

