RoS-KD: A Robust Stochastic Knowledge Distillation Approach for Noisy Medical Imaging

by Ajay Jaiswal, et al.

AI-powered medical imaging has recently attracted enormous attention due to its ability to provide fast-paced healthcare diagnoses. However, it usually suffers from a lack of high-quality datasets owing to high annotation cost, inter-observer variability, human annotator error, and errors in computer-generated labels. Deep learning models trained on datasets with noisy labels are sensitive to the noise type and generalize poorly to unseen samples. To address this challenge, we propose a Robust Stochastic Knowledge Distillation (RoS-KD) framework, which mimics the notion of learning a topic from multiple sources to deter the learning of noisy information. More specifically, RoS-KD learns a smooth, well-informed, and robust student manifold by distilling knowledge from multiple teachers trained on overlapping subsets of the training data. Our extensive experiments on popular medical imaging classification tasks (cardiopulmonary disease and lesion classification) using real-world datasets show the performance benefit of RoS-KD, its ability to distill knowledge from several popular large networks (ResNet-50, DenseNet-121, MobileNet-V2) into a comparatively small network, and its robustness to adversarial attacks (PGD, FGSM). More specifically, RoS-KD achieves a >2% improvement in F1-score on the lesion classification and cardiopulmonary disease classification tasks when the underlying student is ResNet-18, against a recent competitive knowledge distillation baseline. Additionally, on the cardiopulmonary disease classification task, RoS-KD outperforms most of the SOTA baselines by ~1%.
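The abstract describes distilling a student from multiple teachers trained on overlapping data subsets. As a rough illustration of the general multi-teacher distillation idea, here is a minimal NumPy sketch of a distillation loss that averages the teachers' temperature-softened predictions and combines the resulting KL-style term with a standard cross-entropy on the hard labels. The temperature `T`, mixing weight `alpha`, uniform teacher averaging, and all function names are illustrative assumptions, not RoS-KD's actual formulation (which involves stochastic updates of the student manifold).

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def multi_teacher_kd_loss(student_logits, teacher_logits_list, labels,
                          T=4.0, alpha=0.7):
    """Generic multi-teacher distillation loss (illustrative, not RoS-KD's exact loss).

    student_logits:      (batch, classes) array
    teacher_logits_list: list of (batch, classes) arrays, one per teacher
                         (each teacher trained on an overlapping data subset)
    labels:              (batch,) integer class labels
    """
    # Soft targets: uniform average of the teachers' softened predictions.
    teacher_probs = np.mean([softmax(t, T) for t in teacher_logits_list], axis=0)

    # Distillation term: cross-entropy between averaged teacher soft targets
    # and the student's softened predictions, scaled by T^2 as is conventional.
    student_log_probs = np.log(softmax(student_logits, T) + 1e-12)
    kd = -(teacher_probs * student_log_probs).sum(axis=-1).mean() * T * T

    # Supervised term: ordinary cross-entropy on the (possibly noisy) hard labels.
    p_true = softmax(student_logits)[np.arange(len(labels)), labels]
    ce = -np.log(p_true + 1e-12).mean()

    return alpha * kd + (1 - alpha) * ce
```

In this sketch, averaging several teachers' soft targets smooths out label noise that any single teacher may have memorized, which is the intuition behind "learning a topic from multiple sources."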


MED-TEX: Transferring and Explaining Knowledge with Less Data from Pretrained Medical Imaging Models

Deep neural network based image classification methods usually require a...

On the Efficiency of Subclass Knowledge Distillation in Classification Tasks

This work introduces a novel knowledge distillation framework for classi...

Subclass Knowledge Distillation with Known Subclass Labels

This work introduces a novel knowledge distillation framework for classi...

AI can evolve without labels: self-evolving vision transformer for chest X-ray diagnosis through knowledge distillation

Although deep learning-based computer-aided diagnosis systems have recen...

Improving Generalization and Robustness with Noisy Collaboration in Knowledge Distillation

Inspired by trial-to-trial variability in the brain that can result from...

Improving Medical Annotation Quality to Decrease Labeling Burden Using Stratified Noisy Cross-Validation

As machine learning has become increasingly applied to medical imaging d...

Hilbert Distillation for Cross-Dimensionality Networks

3D convolutional neural networks have revealed superior performance in p...
