Linguistically-Informed Self-Attention for Semantic Role Labeling

04/23/2018
by Emma Strubell, et al.

The current state-of-the-art end-to-end semantic role labeling (SRL) model is a deep neural network architecture with no explicit linguistic features. However, prior work has shown that gold syntax trees can dramatically improve SRL, suggesting that neural network models could see great improvements from explicit modeling of syntax. In this work, we present linguistically-informed self-attention (LISA): a new neural network model that combines multi-head self-attention with multi-task learning across dependency parsing, part-of-speech tagging, predicate detection and SRL. Syntax is incorporated by training one attention head to attend to each token's syntactic parent. Our model can predict all of the above tasks, but it is also trained such that if a high-quality syntactic parse is already available, it can be beneficially injected at test time without re-training our SRL model. In experiments on the CoNLL-2005 SRL dataset, LISA achieves an increase of 2.5 F1 absolute over the previous state-of-the-art on newswire with predicted predicates and more than 2.0 F1 on out-of-domain data. On CoNLL-2012 English SRL we also show an improvement of more than 3.0 F1, a 13% reduction in error.
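The core mechanism, one attention head supervised to point at each token's syntactic parent, is compact enough to sketch. Below is a minimal, hypothetical PyTorch sketch (not the authors' code; the class, argument names, and dimensions are illustrative) of such a head, with an auxiliary parsing loss and the option to substitute an externally supplied parse as one-hot attention at test time:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SyntacticAttentionHead(nn.Module):
    """One self-attention head trained to attend to each token's
    syntactic parent (a sketch of LISA's core idea, not the authors' code)."""

    def __init__(self, d_model: int, d_head: int):
        super().__init__()
        self.q = nn.Linear(d_model, d_head)
        self.k = nn.Linear(d_model, d_head)
        self.v = nn.Linear(d_model, d_head)
        self.scale = d_head ** -0.5

    def forward(self, x, gold_parents=None, inject_parse=None):
        # x: (batch, seq, d_model)
        # gold_parents: (batch, seq) parent indices, for the auxiliary loss
        # inject_parse: (batch, seq) parent indices from an external parser,
        #               substituted for the learned attention at test time
        scores = self.q(x) @ self.k(x).transpose(-2, -1) * self.scale

        if inject_parse is not None:
            # Replace the learned distribution with a one-hot parse.
            attn = F.one_hot(inject_parse, num_classes=x.size(1)).float()
        else:
            attn = F.softmax(scores, dim=-1)

        parse_loss = None
        if gold_parents is not None:
            # Each token's attention row acts as a classifier over
            # candidate head positions, trained against the gold parent.
            parse_loss = F.cross_entropy(
                scores.reshape(-1, scores.size(-1)), gold_parents.reshape(-1))

        return attn @ self.v(x), parse_loss
```

Because the parse supervision lives in a single head, the remaining heads stay free to learn task-general attention, and swapping in a higher-quality parse at test time requires no retraining.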


Related research

11/12/2018 · Syntax Helps ELMo Understand Semantics: Is Syntax Still Relevant in a Deep Neural Architecture for SRL?
Do unsupervised methods for learning rich, contextualized token represen...

10/24/2019 · Syntax-Enhanced Self-Attention-Based Semantic Role Labeling
As a fundamental NLP task, semantic role labeling (SRL) aims to discover...

09/05/2019 · Multi-Granularity Self-Attention for Neural Machine Translation
Current state-of-the-art neural machine translation (NMT) uses a deep mu...

12/05/2017 · Deep Semantic Role Labeling with Self-Attention
Semantic Role Labeling (SRL) is believed to be a crucial step towards na...

01/10/2017 · A Simple and Accurate Syntax-Agnostic Neural Model for Dependency-based Semantic Role Labeling
We introduce a simple and accurate neural model for dependency-based sem...

12/21/2020 · Encoding Syntactic Knowledge in Transformer Encoder for Intent Detection and Slot Filling
We propose a novel Transformer encoder-based architecture with syntactic...

10/24/2019 · Promoting the Knowledge of Source Syntax in Transformer NMT Is Not Needed
The utility of linguistic annotation in neural machine translation seeme...
