"Understanding Robustness Lottery": A Comparative Visual Analysis of Neural Network Pruning Approaches

by Zhimin Li, et al.

Deep learning approaches have achieved state-of-the-art performance in many applications by relying on extremely large and heavily overparameterized neural networks. However, such networks have been shown to be very brittle, to generalize poorly to new use cases, and to be difficult if not impossible to deploy on resource-limited platforms. Model pruning, i.e., reducing the size of the network, is a widely adopted strategy that can lead to a more robust and generalizable network, usually orders of magnitude smaller with the same or even improved performance. While many heuristics for model pruning exist, our understanding of the pruning process remains limited. Empirical studies show that some heuristics improve performance while others can make models more brittle or have other side effects. This work aims to shed light on how different pruning methods alter the network's internal feature representation and the corresponding impact on model performance. To provide a meaningful comparison and characterization of the model feature space, we use three geometric metrics that are decomposed from the commonly adopted classification loss. With these metrics, we design a visualization system to highlight the impact of pruning on model prediction as well as on the latent feature embedding. The proposed tool provides an environment for exploring and studying differences among pruning methods and between pruned and original models. By leveraging our visualization, ML researchers can not only identify samples that are fragile to model pruning and data corruption but also gain insight into how some pruned models achieve superior robustness.
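To make the pruning setting concrete, here is a minimal sketch of magnitude-based weight pruning, one of the common heuristics in this literature. This is an illustrative example only, not the specific methods compared in the paper; the function name and NumPy-based formulation are our own assumptions.

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude fraction of weights.

    A common pruning heuristic (illustrative, not the paper's method):
    weights with small absolute value are assumed to contribute little
    to the output and are removed, shrinking the effective network.
    """
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)  # number of weights to remove
    if k == 0:
        return weights.copy()
    # k-th smallest absolute value becomes the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask

# Example: prune 50% of a small weight matrix.
w = np.array([[0.1, -0.8], [0.05, 1.2]])
pruned = magnitude_prune(w, 0.5)  # the two smallest-|w| entries are zeroed
```

Comparing how such masks change the model's latent feature embedding, across different pruning heuristics, is the kind of analysis the paper's visualization system supports.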




