MATE: Multi-view Attention for Table Transformer Efficiency

This work presents a sparse-attention Transformer architecture for modeling documents that contain large tables. Tables are ubiquitous on the web and rich in information, but more than 20% of relational tables have 20 or more rows (Cafarella et al., 2008), and these large tables present a challenge for current Transformer models, which are typically limited to 512 tokens. Here we propose MATE, a novel Transformer architecture designed to model the structure of web tables. MATE uses sparse attention in a way that allows heads to efficiently attend to either rows or columns in a table. This architecture scales linearly in speed and memory, and can handle documents containing more than 8000 tokens on current accelerators. MATE also has a more appropriate inductive bias for tabular data, and sets a new state of the art on three table reasoning datasets. On HybridQA (Chen et al., 2020b), a dataset that involves large documents containing tables, we improve the best prior result by 19 points.
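The row/column sparse attention described above can be illustrated with a small sketch. This is not the authors' implementation; it only shows the masking idea, assuming each token carries (row, column) coordinates, with (0, 0) reserved for non-table text that every head may attend to:

```python
import numpy as np

def table_attention_mask(row_ids, col_ids, head_type):
    """Boolean attention mask for one sparse head (illustrative sketch).

    row_ids, col_ids: per-token table coordinates; (0, 0) marks
    non-table text tokens, which remain globally visible.
    head_type: "row" heads attend within the same row,
    "column" heads within the same column.
    """
    row_ids = np.asarray(row_ids)
    col_ids = np.asarray(col_ids)
    if head_type == "row":
        same = row_ids[:, None] == row_ids[None, :]
    else:
        same = col_ids[:, None] == col_ids[None, :]
    # Text tokens can attend to, and be attended by, everything.
    text = (row_ids == 0) & (col_ids == 0)
    return same | text[:, None] | text[None, :]

# A 2x2 table flattened row-major, preceded by one text token.
rows = [0, 1, 1, 2, 2]
cols = [0, 1, 2, 1, 2]
mask = table_attention_mask(rows, cols, "row")
```

Because each table token only attends within its own row (or column) plus the text span, the number of nonzero attention entries grows linearly with the number of table tokens, which is the source of the linear scaling claimed in the abstract.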


