NeuralScale: Efficient Scaling of Neurons for Resource-Constrained Deep Neural Networks

06/23/2020
by   Eugene Lee, et al.

Choosing the number of neurons in each layer of a deep neural network to maximize performance is not intuitive. In this work, we search for the neuron (filter) configuration of a fixed network architecture that maximizes accuracy. Using iterative pruning methods as a proxy, we parameterize the change in each layer's neuron (filter) count with respect to the change in total parameters, allowing us to efficiently scale an architecture to arbitrary sizes. We also introduce architecture descent, which iteratively refines the parameterized function used for model scaling. The combination of the two proposed methods is coined NeuralScale. To demonstrate the parameter efficiency of NeuralScale, we report empirical results on VGG11, MobileNetV2 and ResNet18 using CIFAR10, CIFAR100 and TinyImageNet as benchmark datasets. Under a parameter-constrained setting (the output neurons (filters) of the default configuration scaled by a factor of 0.25), our results show an increase in accuracy of 3.04 for VGG11 on CIFAR10, with corresponding gains for MobileNetV2 on CIFAR100 and ResNet18 on TinyImageNet.
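The core scaling idea can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes hypothetical pruning checkpoints and fits each layer's surviving filter count as a power law of the parameter scaling ratio (alpha, beta per layer are fitted constants), which can then be evaluated at an arbitrary scaling factor.

```python
import numpy as np

# Hypothetical pruning checkpoints: parameter scaling ratios tau and the
# filter counts that iterative pruning retained in each of three layers.
taus = np.array([1.0, 0.75, 0.5, 0.25])
filters = np.array([
    [64, 52, 38, 22],    # layer 1 filter counts across checkpoints
    [128, 96, 64, 30],   # layer 2
    [256, 170, 100, 40], # layer 3
])

def fit_power_law(taus, counts):
    """Fit counts ~ alpha * tau**beta via least squares in log space."""
    beta, log_alpha = np.polyfit(np.log(taus), np.log(counts), 1)
    return np.exp(log_alpha), beta

# One (alpha, beta) pair per layer parameterizes the whole architecture.
params = [fit_power_law(taus, layer_counts) for layer_counts in filters]

def scale_architecture(tau):
    """Predict each layer's filter count at an arbitrary scaling factor tau."""
    return [max(1, int(round(a * tau ** b))) for a, b in params]

print(scale_architecture(0.25))  # configuration near the smallest checkpoint
print(scale_architecture(2.0))   # extrapolated to a larger parameter budget
```

Because each layer gets its own exponent, narrow layers and wide layers grow at different rates as the budget changes, unlike uniform width multipliers.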


