Syntax Aware LSTM Model for Chinese Semantic Role Labeling

04/03/2017
by Feng Qian, et al.

For the semantic role labeling (SRL) task, both traditional methods and recent recurrent neural network (RNN) based methods rely on feature engineering to exploit syntactic parsing information. In this paper, we propose the Syntax Aware Long Short-Term Memory (SA-LSTM) network. The structure of SA-LSTM is adapted according to dependency parsing information, so that parsing is modeled directly through architecture engineering rather than feature engineering. We experimentally demonstrate that SA-LSTM's improvement comes from the model architecture itself. Furthermore, SA-LSTM significantly outperforms the state of the art on CPB 1.0 according to Student's t-test (p < 0.05).
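The abstract's core idea is to inject dependency-parse information directly into the recurrent cell rather than into a hand-crafted feature set. Below is a minimal, hypothetical NumPy sketch of that general idea: a standard LSTM cell extended with one extra gated input carrying the hidden state of a syntactically related word. The class name, the single syntax gate, and the head-state input h_dep are illustrative assumptions; the paper's exact SA-LSTM equations may differ.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class SyntaxAwareLSTMCell:
    """Hypothetical, simplified syntax-aware LSTM cell.

    On top of a standard LSTM cell, each step receives an extra input
    h_dep: the hidden state of the current word's dependency head (zeros
    if none is available), injected through its own gate. This sketches
    the general idea only, not the authors' exact SA-LSTM formulation.
    """

    def __init__(self, input_size, hidden_size, seed=0):
        d, h = input_size, hidden_size
        rng = np.random.default_rng(seed)
        # One weight matrix per gate: input i, forget f, output o,
        # candidate g, plus a syntax gate s that controls how much of
        # h_dep flows into the cell state.
        self.W = {k: rng.normal(0, 0.1, (h, d + h)) for k in "ifogs"}
        self.b = {k: np.zeros(h) for k in "ifogs"}
        self.U_dep = rng.normal(0, 0.1, (h, h))  # projects the head's state

    def step(self, x, h_prev, c_prev, h_dep):
        z = np.concatenate([x, h_prev])
        i = sigmoid(self.W["i"] @ z + self.b["i"])
        f = sigmoid(self.W["f"] @ z + self.b["f"])
        o = sigmoid(self.W["o"] @ z + self.b["o"])
        g = np.tanh(self.W["g"] @ z + self.b["g"])
        s = sigmoid(self.W["s"] @ z + self.b["s"])   # syntax gate
        syn = np.tanh(self.U_dep @ h_dep)            # parse-derived signal
        c = f * c_prev + i * g + s * syn             # extra syntax term
        h = o * np.tanh(c)
        return h, c
```

To run this over a sentence, one would feed, at step t, the stored hidden state of word t's dependency head whenever the head precedes t (and a zero vector otherwise); the syntax gate then learns how much of that parse-derived signal to mix into the cell state, which is the architecture-engineering idea the abstract describes.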



Related research

03/12/2019 · Syntax-aware Neural Semantic Role Labeling with Supertags
We introduce a new syntax-aware model for dependency-based semantic role...

04/07/2016 · Geometric Scene Parsing with Hierarchical LSTM
This paper addresses the problem of geometric scene parsing, i.e. simult...

02/22/2017 · Feature Generation for Robust Semantic Role Labeling
Hand-engineered feature sets are a well understood method for creating r...

05/06/2016 · LSTM with Working Memory
Previous RNN architectures have largely been superseded by LSTM, or "Lon...

03/23/2016 · Semantic Object Parsing with Graph LSTM
By taking the semantic object parsing task as an exemplar application sc...

01/07/2016 · Leveraging Sentence-level Information with Encoder LSTM for Semantic Slot Filling
Recurrent Neural Network (RNN) and one of its specific architectures, Lo...

08/05/2019 · Semantic Role Labeling with Associated Memory Network
Semantic role labeling (SRL) is a task to recognize all the predicate-ar...
