
The Unstoppable Rise of Computational Linguistics in Deep Learning

05/13/2020
by James Henderson et al.
Idiap Research Institute

In this paper, we trace the history of neural networks applied to natural language understanding tasks, and identify key contributions which the nature of language has made to the development of neural network architectures. We focus on the importance of variable binding and its instantiation in attention-based models, and argue that Transformer is not a sequence model but an induced-structure model. This perspective leads to predictions of the challenges facing research in deep learning architectures for natural language understanding.
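To make the abstract's point concrete, below is a minimal sketch of scaled dot-product attention, the mechanism the abstract describes as an instantiation of variable binding: each query softly "binds" to the value vectors at whichever positions its keys match, retrieving content by association rather than by fixed sequence position. This is an illustrative sketch, not code from the paper; the function names and toy shapes are ours.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attention(queries, keys, values):
    """Scaled dot-product attention: each query binds to a soft
    mixture of value vectors, weighted by query-key similarity."""
    d = queries.shape[-1]
    scores = queries @ keys.T / np.sqrt(d)  # (n_q, n_k) similarity matrix
    weights = softmax(scores, axis=-1)      # soft selection over positions
    return weights @ values                 # retrieved ("bound") values

# toy example: 2 queries attend over 3 key/value slots
rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
print(attention(Q, K, V).shape)  # (2, 4)
```

Because the attention weights depend only on query-key content (position enters only through added position encodings), the same mechanism works over any induced structure, which is the sense in which the paper argues Transformer is an induced-structure model rather than a sequence model.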

