Connection Reduction Is All You Need

08/02/2022
by Rui-Yang Ju, et al.

Convolutional Neural Networks (CNNs) increase depth by stacking convolutional layers, and deeper models perform better at image recognition. However, empirical research shows that simply stacking convolutional layers does not make a network train better, while skip connections (residual learning) can improve model performance. For image classification, models with globally densely connected architectures perform well on large datasets such as ImageNet, but they are not well suited to small datasets such as CIFAR-10 and SVHN. Departing from dense connections, we propose two new algorithms for connecting layers. Baseline is a densely connected network, and the networks built with the two new connection algorithms are named ShortNet1 and ShortNet2, respectively. Image classification experiments on CIFAR-10 and SVHN show that ShortNet1 achieves a 5% lower test error rate than Baseline, while ShortNet2 speeds up inference time by 40% with only a small loss in test accuracy.

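To make the contrast between dense connections and connection reduction concrete, below is a minimal PyTorch sketch. It is an illustrative assumption, not the authors' code: `DenseBlock` feeds each layer the concatenation of all previous feature maps (the Baseline-style dense pattern), while `ReducedBlock` keeps only the `keep_last` most recent outputs. The `ConvUnit`, `ReducedBlock`, and `keep_last` names and the specific reduction rule are hypothetical; the paper's actual ShortNet1/ShortNet2 connection algorithms differ.

```python
# Illustrative sketch only: dense connections vs. a reduced-connection variant.
import torch
import torch.nn as nn


class ConvUnit(nn.Module):
    """BN -> ReLU -> 3x3 conv producing `growth` new feature maps."""

    def __init__(self, in_channels: int, growth: int):
        super().__init__()
        self.bn = nn.BatchNorm2d(in_channels)
        self.conv = nn.Conv2d(in_channels, growth, kernel_size=3, padding=1, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.conv(torch.relu(self.bn(x)))


class DenseBlock(nn.Module):
    """Every layer sees the concatenation of ALL earlier outputs (dense, Baseline-style)."""

    def __init__(self, in_channels: int, growth: int, num_layers: int):
        super().__init__()
        self.layers = nn.ModuleList(
            ConvUnit(in_channels + i * growth, growth) for i in range(num_layers)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        features = [x]
        for layer in self.layers:
            features.append(layer(torch.cat(features, dim=1)))
        return torch.cat(features, dim=1)


class ReducedBlock(nn.Module):
    """Each layer sees only the `keep_last` most recent outputs (assumed reduction rule)."""

    def __init__(self, in_channels: int, growth: int, num_layers: int, keep_last: int = 2):
        super().__init__()
        self.keep_last = keep_last
        self.layers = nn.ModuleList()
        for i in range(num_layers):
            if i + 1 <= keep_last:
                channels = in_channels + i * growth  # the block input x is still visible
            else:
                channels = keep_last * growth        # only recent layer outputs remain
            self.layers.append(ConvUnit(channels, growth))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        features = [x]
        for layer in self.layers:
            features.append(layer(torch.cat(features[-self.keep_last:], dim=1)))
        return torch.cat(features[-self.keep_last:], dim=1)


if __name__ == "__main__":
    x = torch.randn(1, 16, 32, 32)  # CIFAR-10-sized input after a stem convolution
    dense = DenseBlock(16, growth=12, num_layers=4)
    reduced = ReducedBlock(16, growth=12, num_layers=4, keep_last=2)
    print(dense(x).shape, reduced(x).shape)
    # torch.Size([1, 64, 32, 32]) vs. torch.Size([1, 24, 32, 32])
```

The point of the sketch is the scaling behavior: in the dense block the input width of layer i grows linearly with i, so concatenation cost and memory traffic grow with depth, whereas a reduced-connection block keeps the per-layer input width bounded, which is one plausible reason a connection-reduced network can infer faster on small-image datasets.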