Exploring Contextualized Neural Language Models for Temporal Dependency Parsing

04/30/2020
by   Hayley Ross, et al.

Extracting temporal relations between events and time expressions has many applications, such as constructing event timelines and time-related question answering. It is a challenging problem that requires syntactic and semantic information at the sentence or discourse level, which may be captured by deep contextualized language models such as BERT (Devlin et al., 2019). In this paper, we develop several variants of a BERT-based temporal dependency parser and show that BERT significantly improves temporal dependency parsing (Zhang and Xue, 2018a). Source code and trained models will be made available at github.com.
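To make the idea concrete, here is a minimal sketch (not the authors' code) of how a BERT-based temporal dependency parser might work: each event or time-expression mention is represented by pooling its BERT subword vectors, and a small scorer ranks candidate parents for each mention. The `span_embedding` helper, the `scorer` architecture, the hidden size of 128, and the hard-coded span indices are all illustrative assumptions, not details from the paper.

```python
# Sketch of a BERT-based parent scorer for temporal dependency parsing.
# Assumed/illustrative: span_embedding, scorer, span indices, hidden size.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
bert = BertModel.from_pretrained("bert-base-uncased")

def span_embedding(hidden_states, start, end):
    """Represent a mention by averaging its subword vectors."""
    return hidden_states[0, start:end].mean(dim=0)

# Hypothetical pairwise scorer: (child span; candidate parent span) -> score.
scorer = nn.Sequential(
    nn.Linear(2 * bert.config.hidden_size, 128),
    nn.ReLU(),
    nn.Linear(128, 1),
)

sentence = "She arrived on Tuesday and left the next morning."
inputs = tokenizer(sentence, return_tensors="pt")
with torch.no_grad():
    hidden = bert(**inputs).last_hidden_state  # (1, seq_len, 768)

# Toy mention spans over subword indices; in practice these would come
# from a mention extractor, not be hard-coded.
child = span_embedding(hidden, 6, 7)           # event "left"
candidates = {
    "arrived": span_embedding(hidden, 2, 3),   # candidate parent event
    "Tuesday": span_embedding(hidden, 4, 5),   # candidate parent timex
}

# The predicted parent is the highest-scoring candidate.
scores = {name: scorer(torch.cat([child, vec])).item()
          for name, vec in candidates.items()}
print(max(scores, key=scores.get))
```

With a trained scorer, repeating this ranking for every mention in a document yields a temporal dependency tree; the untrained scorer above only illustrates the interface.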
