N-Grammer: Augmenting Transformers with latent n-grams

07/13/2022
by   Aurko Roy, et al.

Transformer models have recently emerged as one of the foundational models in natural language processing, and as a byproduct, there has been significant recent interest and investment in scaling these models. However, the training and inference costs of these large Transformer language models are prohibitive, necessitating further research into more efficient variants. In this work, we propose a simple yet effective modification to the Transformer architecture, inspired by the statistical language modeling literature, that augments the model with n-grams constructed from a discrete latent representation of the text sequence. We evaluate our model, the N-Grammer, on language modeling with the C4 dataset and on text classification with the SuperGLUE dataset, and find that it outperforms several strong baselines such as the Transformer and the Primer. We open-source our model in Jax for reproducibility.
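
The core idea can be illustrated with a minimal sketch: derive a discrete latent ID for each position, combine each ID with its predecessor into a bi-gram ID, hash that into a fixed bi-gram vocabulary, and look up a learned bi-gram embedding that augments the ordinary token embedding. The sketch below is not the authors' released layer; the function name, the hashing constant, and all dimensions are illustrative assumptions, and the open-sourced Jax code remains the authoritative reference.

```python
# Minimal sketch of augmenting token embeddings with latent bi-gram embeddings.
# Names, sizes, and the hashing scheme are illustrative, not the paper's exact choices.

import jax
import jax.numpy as jnp


def latent_ngram_augment(token_emb, latent_ids, ngram_table, vocab_size):
    """Augment token embeddings with embeddings of latent bi-grams.

    token_emb:   [batch, seq_len, d_model]  token embeddings from the input layer
    latent_ids:  [batch, seq_len]           discrete latent code per position
    ngram_table: [vocab_size, d_ngram]      learnable bi-gram embedding table
    """
    # Shift latent IDs right by one position so each token sees its predecessor.
    prev_ids = jnp.pad(latent_ids, ((0, 0), (1, 0)))[:, :-1]

    # Combine (previous, current) latent IDs into a single bi-gram ID and hash it
    # into the bi-gram vocabulary; the multiplier is an arbitrary odd constant.
    bigram_ids = (latent_ids * 1000003 + prev_ids) % vocab_size

    # Look up bi-gram embeddings and concatenate them with the token embeddings.
    bigram_emb = ngram_table[bigram_ids]  # [batch, seq_len, d_ngram]
    return jnp.concatenate([token_emb, bigram_emb], axis=-1)


# Tiny usage example with random inputs.
key = jax.random.PRNGKey(0)
batch, seq_len, d_model, d_ngram, vocab_size = 2, 8, 16, 8, 4096
token_emb = jax.random.normal(key, (batch, seq_len, d_model))
latent_ids = jax.random.randint(key, (batch, seq_len), 0, 512)
ngram_table = jax.random.normal(key, (vocab_size, d_ngram))
out = latent_ngram_augment(token_emb, latent_ids, ngram_table, vocab_size)
print(out.shape)  # (2, 8, 24)
```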

Related research

09/17/2021 · Primer: Searching for Efficient Transformers for Language Modeling
Large Transformer models have been central to recent advances in natural...

03/14/2020 · Finnish Language Modeling with Deep Transformer Models
Transformers have recently taken the center stage in language modeling a...

07/06/2023 · Vision Language Transformers: A Survey
Vision language tasks, such as answering questions about or generating c...

01/29/2018 · Discrete Autoencoders for Sequence Models
Recurrent models for sequences have been recently successful at many tas...

07/17/2023 · Retentive Network: A Successor to Transformer for Large Language Models
In this work, we propose Retentive Network (RetNet) as a foundation arch...

12/21/2020 · RealFormer: Transformer Likes Residual Attention
Transformer is the backbone of modern NLP models. In this paper, we prop...

07/15/2023 · Transformers are Universal Predictors
We find limits to the Transformer architecture for language modeling and...
