Investigating transformers in the decomposition of polygonal shapes as point collections

08/17/2021
by   Andrea Alfieri, et al.

Transformers can generate predictions in two ways: (1) auto-regressively, by conditioning each sequence element on the previous ones, or (2) by producing an output sequence in parallel. While research has mostly explored this difference on sequential tasks in NLP, we study the difference between auto-regressive and parallel prediction on visual set prediction tasks, and in particular on polygonal shapes in images, because polygons are representative of numerous types of objects, such as buildings or obstacles for aerial vehicles. This is challenging for deep learning architectures, as a polygon can consist of a varying cardinality of points. We provide evidence of the importance of natural orders for Transformers, and show the benefit of decomposing complex polygons into collections of points in an auto-regressive manner.
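As a rough illustration of the two decoding schemes the abstract contrasts (this is a toy sketch, not the authors' implementation; `toy_model`, the `STOP` token, and the fixed-length parallel head are all hypothetical):

```python
# Toy sketch of auto-regressive vs. parallel decoding of a polygon's vertices.
# All names here (toy_model, STOP, slot queries) are illustrative assumptions,
# not taken from the paper.

STOP = (-1.0, -1.0)  # hypothetical end-of-polygon token

def toy_model(prefix):
    """Stand-in for one step of a trained Transformer decoder:
    given the vertices generated so far, predict the next vertex
    of a unit square, then STOP."""
    square = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
    return square[len(prefix)] if len(prefix) < len(square) else STOP

def decode_autoregressive(model, max_len=16):
    """Condition each new vertex on all previously generated ones,
    stopping when the model emits the end token."""
    points = []
    for _ in range(max_len):
        nxt = model(points)
        if nxt == STOP:
            break
        points.append(nxt)
    return points

def decode_parallel(model, length=4):
    """Emit a fixed number of vertices in one shot: each 'slot'
    is predicted independently of the others, mimicking a
    parallel set-prediction head with a known cardinality."""
    return [model([(0.0, 0.0)] * i) for i in range(length)]

if __name__ == "__main__":
    print(decode_autoregressive(toy_model))
    print(decode_parallel(toy_model))
```

The sketch highlights the structural difference the paper studies: the auto-regressive loop naturally handles a varying number of points via a stop token, while the parallel head must fix the output cardinality up front.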

