UPANets: Learning from the Universal Pixel Attention Networks

03/15/2021
by   Ching-Hsun Tseng, et al.

In image classification, networks built on skip and dense connections have dominated most leaderboards. Recently, following the success of multi-head attention in natural language processing, the field has shifted toward either Transformer-like models or hybrids of CNNs with attention. However, the former require tremendous resources to train, while the latter strike a better balance between cost and performance. In this work, to let CNNs handle both global and local information, we propose UPANets, which equip channel-wise attention with a hybrid skip-dense-connection structure. This extreme-connection structure also makes UPANets robust, yielding a smoother loss landscape. In experiments, UPANets surpassed most well-known and widely used SOTA models with an accuracy of 96.47% on CIFAR-10. Importantly, these results come with high parameter efficiency and were obtained by training on a single consumer-grade GPU. The implementation of UPANets is available at https://github.com/hanktseng131415go/UPANets.
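The abstract names three ingredients: channel-wise attention, a skip (residual) connection, and a dense (concatenation) connection. Below is a minimal PyTorch sketch of how these pieces could fit together in one block. It is an illustration under stated assumptions, not the authors' implementation (see the linked repository for that); the class names ChannelPixelAttention and HybridBlock, and all layer choices, are hypothetical.

    # Minimal sketch (assumptions, not the UPANets code): channel-wise attention
    # applied per pixel, combined with a skip (add) and a dense (concat) connection.
    import torch
    import torch.nn as nn

    class ChannelPixelAttention(nn.Module):
        """Each pixel attends over its own channel vector."""
        def __init__(self, channels):
            super().__init__()
            # 1x1 convolution acts as a per-pixel linear map over channels
            self.proj = nn.Conv2d(channels, channels, kernel_size=1)

        def forward(self, x):
            # Softmax over the channel axis turns the projection into an
            # attention distribution; multiply to re-weight the channels.
            weights = torch.softmax(self.proj(x), dim=1)
            return x * weights

    class HybridBlock(nn.Module):
        """Conv path with channel attention, plus skip and dense connections."""
        def __init__(self, in_ch, out_ch):
            super().__init__()
            self.conv = nn.Sequential(
                nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1, bias=False),
                nn.BatchNorm2d(out_ch),
                nn.ReLU(inplace=True),
            )
            self.attn = ChannelPixelAttention(out_ch)
            # 1x1 projection so the residual add matches the output width
            self.skip = nn.Conv2d(in_ch, out_ch, kernel_size=1)

        def forward(self, x):
            out = self.attn(self.conv(x)) + self.skip(x)  # skip connection
            return torch.cat([x, out], dim=1)             # dense connection

    # Usage: the dense concatenation grows the width from 32 to 64 channels.
    block = HybridBlock(32, 32)
    y = block(torch.randn(2, 32, 16, 16))
    print(y.shape)  # torch.Size([2, 64, 16, 16])

The concatenation is what makes the connection "dense" in the DenseNet sense: the block's input features are passed forward unchanged alongside the new attention-weighted features, so later layers can see both.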



Related research

09/09/2021 · UCTransNet: Rethinking the Skip Connections in U-Net from a Channel-wise Perspective with Transformer
Most recent semantic segmentation methods adopt a U-Net framework with a...

05/28/2022 · MDMLP: Image Classification from Scratch on Small Datasets with MLP
The attention mechanism has become a go-to technique for natural languag...

05/12/2023 · T-former: An Efficient Transformer for Image Inpainting
Benefiting from powerful convolutional neural networks (CNNs), learning-...

09/07/2022 · Spach Transformer: Spatial and Channel-wise Transformer Based on Local and Global Self-attentions for PET Image Denoising
Positron emission tomography (PET) is widely used in clinics and researc...

02/13/2022 · BViT: Broad Attention based Vision Transformer
Recent works have demonstrated that transformer can achieve promising pe...

09/30/2022 · Rethinking skip connection model as a learnable Markov chain
Over the past few years since the birth of ResNet, skip connection has b...

12/16/2022 · Convolution-enhanced Evolving Attention Networks
Attention-based neural networks, such as Transformers, have become ubiqu...
