Exploring Contextualized Neural Language Models for Temporal Dependency Parsing

Hayley Ross, Jonathan Cai, Bonan Min

Extracting temporal relations between events and time expressions has many applications, such as constructing event timelines and time-related question answering. It is a challenging problem that requires syntactic and semantic information at the sentence or discourse level, which may be captured by deep language models such as BERT (Devlin et al., 2019). In this paper, we develop several variants of a BERT-based temporal dependency parser, and show that BERT significantly improves temporal dependency parsing (Zhang and Xue, 2018a). Source code and trained models will be made available at github.com.
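The abstract does not spell out the parsing model, but temporal dependency parsing (Zhang and Xue, 2018a) is typically cast as ranking: each event or time expression selects one parent (an earlier node or a root such as the document creation time) by scoring candidate edges from contextual embeddings. Below is a minimal, hypothetical sketch of that ranking step; the node names, scoring weights, and random vectors standing in for BERT embeddings are all illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy timeline nodes: ROOT, document creation time (DCT), two events
# and one time expression. Random vectors stand in for the contextual
# (e.g., BERT) embeddings a real parser would use.
DIM = 16
nodes = ["ROOT", "DCT", "said", "Tuesday", "left"]
emb = {n: rng.normal(size=DIM) for n in nodes}

# Hypothetical scoring weights for a (child, parent) edge.
W = rng.normal(size=2 * DIM) * 0.1

def score(child: str, parent: str) -> float:
    """Score a candidate (child -> parent) temporal dependency edge."""
    pair = np.concatenate([emb[child], emb[parent]])
    return float(pair @ W)

def pick_parent(child: str, candidates: list[str]) -> str:
    """Rank all candidate parents and return the highest-scoring one."""
    return max(candidates, key=lambda p: score(child, p))

# Each non-root node selects its parent among the preceding nodes,
# which guarantees the result is a tree over the timeline.
tree = {}
for i, child in enumerate(nodes[1:], start=1):
    tree[child] = pick_parent(child, nodes[:i])
print(tree)
```

In a BERT-based variant, the random `emb` vectors would be replaced by the encoder's token representations for each event or time expression, and `W` by a learned scoring layer trained on annotated temporal dependency trees.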
