Understanding Individual Neuron Importance Using Information Theory

04/18/2018
by Kairen Liu et al.

In this work, we characterize the outputs of individual neurons in a trained feed-forward neural network by entropy, mutual information with the class variable, and a class selectivity measure based on Kullback-Leibler divergence. By cumulatively ablating neurons in the network, we connect these information-theoretic measures to the impact their removal has on classification performance on the test set. We observe that, looking at the neural network as a whole, none of these measures is a good indicator of classification performance, thus confirming recent results by Morcos et al. However, looking at specific layers separately, both mutual information and class selectivity are positively correlated with classification performance. We thus conclude that it is ill-advised to compare these measures across layers, and that different layers may be most appropriately characterized by different measures. We then discuss pruning neurons from neural networks to reduce the computational complexity of inference. Drawing on our results, we perform pruning based on information-theoretic measures on a fully connected feed-forward neural network with two hidden layers trained on the MNIST dataset and compare the results to a recently proposed pruning method. We furthermore show that the common practice of re-training after pruning can partly be obviated by a surgery step called bias balancing, without incurring significant performance degradation.
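The two ingredients of the pruning approach described above can be illustrated concretely: scoring each neuron by the mutual information between its (quantized) output and the class label, and compensating for removed neurons by folding their mean contribution into the next layer's bias. The following is a minimal numpy sketch under simple assumptions (histogram binning for the mutual-information estimate, a single fully connected layer downstream), not the authors' exact implementation; all function and variable names are illustrative.

```python
import numpy as np

def mutual_information(activations, labels, n_bins=30):
    """Plug-in estimate of I(T; Y) for one neuron via histogram binning.

    activations: (N,) outputs of the neuron on a dataset; labels: (N,) class ids.
    The activation is quantized into n_bins, then I(T; Y) = H(T) - H(T|Y).
    """
    edges = np.histogram_bin_edges(activations, bins=n_bins)
    t = np.clip(np.digitize(activations, edges) - 1, 0, n_bins - 1)

    def entropy(x):
        p = np.bincount(x, minlength=n_bins).astype(float)
        p = p[p > 0] / p.sum()
        return -np.sum(p * np.log2(p))

    h_t_given_y = 0.0
    for y in np.unique(labels):
        mask = labels == y
        h_t_given_y += mask.mean() * entropy(t[mask])
    return entropy(t) - h_t_given_y

def prune_with_bias_balancing(W_out, b_out, acts, keep_idx):
    """Remove hidden neurons not in keep_idx from the next layer's weights.

    Bias balancing: fold the mean contribution of each pruned neuron
    (its average activation times its outgoing weights) into the next
    layer's bias, so the expected pre-activation is preserved without
    re-training.

    W_out: (n_out, n_hidden) outgoing weights, b_out: (n_out,) bias,
    acts: (N, n_hidden) hidden activations on a calibration set.
    """
    drop_idx = np.setdiff1d(np.arange(W_out.shape[1]), keep_idx)
    mean_act = acts[:, drop_idx].mean(axis=0)        # (n_drop,)
    b_new = b_out + W_out[:, drop_idx] @ mean_act    # absorb mean contribution
    return W_out[:, keep_idx], b_new
```

A layer would then be pruned by ranking its neurons with `mutual_information`, keeping the top-k indices, and passing those to `prune_with_bias_balancing`; by construction, the mean pre-activation of the next layer is unchanged, which is what allows re-training to be partly skipped.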


