Attend and Rectify: a Gated Attention Mechanism for Fine-Grained Recovery

07/19/2018
by   Pau Rodríguez, et al.

We propose a novel attention mechanism to enhance Convolutional Neural Networks for fine-grained recognition. It learns to attend to lower-level feature activations without requiring part annotations, and uses these activations to update and rectify the output likelihood distribution. In contrast to other approaches, the proposed mechanism is modular, architecture-independent, and efficient in terms of both parameters and computation. Experiments show that networks augmented with our approach systematically improve their classification accuracy and become more robust to clutter. As a result, Wide Residual Networks augmented with our proposal surpass the state-of-the-art classification accuracies on CIFAR-10, the Adience gender recognition task, Stanford Dogs, and UEC Food-100.
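The core idea, attending to intermediate feature activations and using a learned gate to rectify the network's output distribution, can be illustrated with a minimal NumPy sketch. This is a hypothetical single-head simplification, not the paper's exact formulation: the function name `attention_rectify` and the weights `w_att` and `w_cls` are illustrative assumptions, and the paper's multi-head, per-layer gating is collapsed into one scalar `gate`.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_rectify(features, logits, w_att, w_cls, gate):
    """Rectify class logits with one attention head over a conv feature map.

    features : (C, H, W) lower-level activations (hypothetical shapes)
    logits   : (K,) the base network's class logits
    w_att    : (1, C) projection producing a spatial attention map
    w_cls    : (K, C) classifier for the attended descriptor
    gate     : scalar in [0, 1] weighting the head's contribution
    """
    C, H, W = features.shape
    flat = features.reshape(C, H * W)            # flatten spatial dims: (C, HW)
    att = softmax(w_att @ flat, axis=-1)         # (1, HW) attention over locations
    pooled = (flat * att).sum(axis=-1)           # (C,) attention-weighted descriptor
    head_logits = w_cls @ pooled                 # (K,) this head's class hypothesis
    # The gate blends the base prediction with the attention head's hypothesis.
    return softmax((1 - gate) * logits + gate * head_logits)
```

With `gate = 0` the module leaves the base prediction untouched, which is one way such a design stays architecture-independent: it can be bolted onto any intermediate layer without breaking the original network.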


Related research

- 07/30/2019 · Pay attention to the activations: a modular attention mechanism for fine-grained image recognition — Fine-grained image recognition is central to many multimedia tasks such ...
- 12/11/2018 · Reproduction Report on "Learn to Pay Attention" — We have successfully implemented the "Learn to Pay Attention" model of a...
- 04/19/2016 · An Attentive Neural Architecture for Fine-grained Entity Type Classification — In this work we propose a novel attention-based neural network model for...
- 11/27/2018 · Generating Attention from Classifier Activations for Fine-grained Recognition — Recent advances in fine-grained recognition utilize attention maps to lo...
- 11/17/2019 · ELoPE: Fine-Grained Visual Classification with Efficient Localization, Pooling and Embedding — The task of fine-grained visual classification (FGVC) deals with classif...
- 11/26/2021 · TDAN: Top-Down Attention Networks for Enhanced Feature Selectivity in CNNs — Attention modules for Convolutional Neural Networks (CNNs) are an effect...
- 02/28/2022 · Dynamic N:M Fine-grained Structured Sparse Attention Mechanism — Transformers are becoming the mainstream solutions for various tasks lik...
