Pruning artificial neural networks: a way to find well-generalizing, high-entropy sharp minima

04/30/2020
by   Enzo Tartaglione, et al.

Recently, a race towards the simplification of deep networks has begun, showing that it is effectively possible to reduce the size of these models with little or no performance loss. However, there is a general lack of understanding of why these pruning strategies are effective. In this work, we compare and analyze pruned solutions obtained with two different pruning approaches, one-shot and gradual, and show that the latter is more effective. In particular, we find that gradual pruning allows access to narrow, well-generalizing minima that are typically missed by one-shot approaches. We also propose PSP-entropy, a measure of how strongly a given neuron correlates to specific learned classes. Interestingly, we observe that the features extracted by iteratively-pruned models are less correlated to specific classes, potentially making these models a better fit for transfer learning.
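To make the distinction concrete, the two strategies contrasted in the abstract can be sketched with simple magnitude pruning. This is a minimal illustration, not the paper's actual procedure: the function names and the ramp schedule are assumptions, and the optional `finetune` callback merely stands in for the retraining that gradual pipelines perform between pruning rounds.

```python
import numpy as np

def one_shot_prune(w, sparsity):
    """Zero out the smallest-magnitude weights in a single step."""
    k = int(round(sparsity * w.size))
    if k == 0:
        return w.copy()
    # Threshold at the k-th smallest absolute value.
    thresh = np.sort(np.abs(w), axis=None)[k - 1]
    pruned = w.copy()
    pruned[np.abs(pruned) <= thresh] = 0.0
    return pruned

def gradual_prune(w, sparsity, steps=5, finetune=None):
    """Reach the target sparsity over several rounds instead of one.

    Already-zeroed weights have the smallest magnitude, so each round
    keeps them pruned and removes a few more of the surviving weights.
    """
    pruned = w.copy()
    for step in range(1, steps + 1):
        target = sparsity * step / steps  # ramp sparsity up gradually
        pruned = one_shot_prune(pruned, target)
        if finetune is not None:
            pruned = finetune(pruned)  # stand-in for retraining between rounds
    return pruned
```

In a real training loop the `finetune` step is what matters: it lets the network recover after each small pruning round, which is how gradual schemes reach the narrow minima that a single aggressive cut misses.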


Related research

10/16/2021
Neural Network Pruning Through Constrained Reinforcement Learning
Network pruning reduces the size of neural networks by removing (pruning...

07/08/2022
Pruning Early Exit Networks
Deep learning models that perform well often have high computational cos...

06/28/2020
ESPN: Extremely Sparse Pruned Networks
Deep neural networks are often highly overparameterized, prohibiting the...

05/31/2022
ViNNPruner: Visual Interactive Pruning for Deep Learning
Neural networks grow vastly in size to tackle more sophisticated tasks. ...

05/25/2022
Train Flat, Then Compress: Sharpness-Aware Minimization Learns More Compressible Models
Model compression by way of parameter pruning, quantization, or distilla...

06/07/2022
Recall Distortion in Neural Network Pruning and the Undecayed Pruning Algorithm
Pruning techniques have been successfully used in neural networks to tra...

08/01/2023
Understanding Activation Patterns in Artificial Neural Networks by Exploring Stochastic Processes
To gain a deeper understanding of the behavior and learning dynamics of ...
