Squeeze aggregated excitation network

08/25/2023
by Mahendran N, et al.

Convolutional neural networks learn spatial representations that capture patterns in vision tasks. Squeeze-and-excitation networks link channel-wise representations by explicitly modeling interdependencies at the channel level. Multi-layer perceptrons learn global representations, and in most models they are placed at the end, after all convolutional layers, to gather the information learned before classification. We propose a method of inducing such global representations within channels to improve model performance. We propose SaEnet, the Squeeze aggregated excitation network, which learns global channel-wise representations between layers. The proposed module passes the important information retained after the squeeze through an aggregated excitation before regaining its original shape. We also introduce the idea of a multibranch linear (dense) layer in the network. This layer learns global representations from the condensed information, which enhances the representational power of the network. The proposed module has undergone extensive experiments on the ImageNet and CIFAR-100 datasets and has been compared with closely related architectures. The analysis shows that the proposed models' outputs are comparable to, and in some cases better than, those of existing state-of-the-art architectures.
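The flow described in the abstract (squeeze to a per-channel descriptor, a multibranch dense excitation, expansion back to the channel dimension, then channel-wise rescaling) can be sketched in plain NumPy. The branch count, reduction size, and weight shapes below are illustrative assumptions, not the paper's published configuration.

```python
import numpy as np

def sae_excitation(x, branch_weights, expand_weights):
    """Squeeze-aggregated-excitation over a feature map x of shape (C, H, W).

    Hypothetical sketch: exact branch count, reduction ratio, and weight
    shapes are assumptions, not the published SaEnet configuration.
    """
    # Squeeze: global average pooling collapses spatial dims to a (C,) descriptor.
    z = x.mean(axis=(1, 2))
    # Aggregated excitation: a multibranch linear (dense) layer; each branch maps
    # the squeezed descriptor to a reduced dimension, and the branch outputs are
    # summed (ResNeXt-style aggregation) before expansion.
    reduced = sum(np.maximum(z @ wb, 0.0) for wb in branch_weights)  # ReLU per branch
    # Expansion: regain the original channel dimension, then gate with a sigmoid.
    s = 1.0 / (1.0 + np.exp(-(reduced @ expand_weights)))
    # Recalibrate: scale each channel of x by its learned importance in (0, 1).
    return x * s[:, None, None]

# Toy usage: 8 channels, 4 branches, each branch reducing to 2 dimensions.
rng = np.random.default_rng(0)
x = rng.standard_normal((8, 5, 5))
branches = [rng.standard_normal((8, 2)) * 0.1 for _ in range(4)]
expand = rng.standard_normal((2, 8)) * 0.1
y = sae_excitation(x, branches, expand)
print(y.shape)  # (8, 5, 5)
```

Because the sigmoid gate lies in (0, 1), the output is a per-channel damping of the input; in a trained network the gates would emphasize informative channels and suppress the rest.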

research
12/10/2021

A Discriminative Channel Diversification Network for Image Classification

Channel attention mechanisms in convolutional neural networks have been ...
research
05/28/2019

RecNets: Channel-wise Recurrent Convolutional Neural Networks

In this paper, we introduce Channel-wise recurrent convolutional neural ...
research
04/11/2023

Variations of Squeeze and Excitation networks

Convolutional neural networks learn spatial features and are heavily in...
research
05/23/2023

Efficient Multi-Scale Attention Module with Cross-Spatial Learning

Remarkable effectiveness of the channel or spatial attention mechanisms ...
research
11/29/2018

Global Second-order Pooling Neural Networks

Deep Convolutional Networks (ConvNets) are fundamental to, besides large...
research
12/29/2022

BiMLP: Compact Binary Architectures for Vision Multi-Layer Perceptrons

This paper studies the problem of designing compact binary architectures...
research
06/17/2021

ShuffleBlock: Shuffle to Regularize Deep Convolutional Neural Networks

Deep neural networks have enormous representational power which leads th...
