SARN: Relational Reasoning through Sequential Attention

11/01/2018
by Jinwon An, et al.

This paper proposes an attention-augmented relational network called SARN (Sequential Attention Relational Network) that carries out relational reasoning by extracting reference objects and forming efficient pairings between objects. SARN greatly reduces the computational and memory requirements of the standard relational network, which scores all object pairs. It also achieves higher accuracy than competing models on the Sort-of-CLEVR dataset, especially on relational questions.
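To make the efficiency claim concrete, here is a minimal sketch (not the paper's implementation; the function names, toy vectors, and hard-attention shortcut are illustrative assumptions) contrasting the baseline relational network's all-pairs enumeration with SARN-style pairing, which first attends to a single reference object and then pairs only that reference with every other object:

```python
import numpy as np

def all_pairs(objects):
    """Baseline Relational Network: score every ordered object pair, O(n^2)."""
    n = len(objects)
    return [(i, j) for i in range(n) for j in range(n) if i != j]

def sequential_attention_pairs(objects, query):
    """SARN-style sketch: attend once to pick a reference object, then pair
    only that reference with each remaining object, O(n)."""
    objs = np.asarray(objects, dtype=float)
    # Attention scores: dot product of each object feature with the query.
    scores = objs @ np.asarray(query, dtype=float)
    ref = int(np.argmax(scores))  # hard attention, for illustration only
    return [(ref, j) for j in range(len(objects)) if j != ref]

# Toy example: four 2-D "object" features and a query favouring the last one.
objects = [[1.0, 0.0], [0.0, 1.0], [0.5, 0.5], [0.0, 2.0]]
query = [0.0, 1.0]

print(len(all_pairs(objects)))                     # n*(n-1) = 12 pairs
print(sequential_attention_pairs(objects, query))  # 3 pairs from the reference
```

With n objects the baseline forms n(n-1) pairs, while the reference-based scheme forms only n-1, which is the source of the reduced compute and memory footprint described above.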

Related research

- 11/01/2018 — Dilated DenseNets for Relational Reasoning
- 10/11/2019 — R-SQAIR: Relational Sequential Attend, Infer, Repeat
- 04/14/2022 — Optimal quadratic binding for relational reasoning in vector symbolic neural architectures
- 06/05/2018 — Relational recurrent neural networks
- 08/22/2021 — Relational Embedding for Few-Shot Classification
- 04/11/2019 — Relational Graph Attention Networks
- 08/03/2019 — Searching for Ambiguous Objects in Videos using Relational Referring Expressions
