Neural Ranking Models for Temporal Dependency Structure Parsing

Yuchen Zhang, Nianwen Xue


Abstract
We design and build the first neural temporal dependency parser. It uses a neural ranking model with minimal feature engineering, and parses time expressions and events in a text into a temporal dependency tree structure. We evaluate our parser on two domains: news reports and narrative stories. In a parsing-only evaluation setup where gold time expressions and events are provided, our parser reaches F-scores of 0.81 and 0.70 on unlabeled and labeled parsing, respectively, a result that is very competitive against alternative approaches. In an end-to-end evaluation setup where time expressions and events are automatically recognized, our parser beats two strong baselines on both data domains. Our experimental results and discussions shed light on the nature of temporal dependency structures in different domains and provide insights that we believe will be valuable to future research in this area.
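The abstract describes the parser's output as a temporal dependency tree, in which every time expression and event attaches to a parent node (another time expression, event, or an abstract root) via a labeled temporal relation. The following is a minimal sketch of such a structure; the class, field, and relation names are illustrative assumptions, not taken from the authors' implementation.

```python
# Minimal sketch of a temporal dependency tree. Node kinds and
# relation labels here are illustrative placeholders, not the
# label inventory used in the paper.
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class TDTNode:
    text: str                        # surface form of the event/timex
    kind: str                        # "event", "timex", or "root"
    relation: Optional[str] = None   # label on the edge to the parent
    children: List["TDTNode"] = field(default_factory=list)

    def attach(self, child: "TDTNode", relation: str) -> "TDTNode":
        """Attach a child under this node with a temporal relation label."""
        child.relation = relation
        self.children.append(child)
        return child


# Build a tiny tree: a time expression anchored to the root,
# with an event temporally included in it.
root = TDTNode("ROOT", "root")
timex = root.attach(TDTNode("yesterday", "timex"), "depends-on")
timex.attach(TDTNode("announced", "event"), "included")


def edges(node: TDTNode):
    """Depth-first traversal yielding (parent, relation, child) triples."""
    for child in node.children:
        yield (node.text, child.relation, child.text)
        yield from edges(child)
```

Because every node has exactly one parent, the whole temporal structure of a document can be recovered by a single traversal of the tree, which is what makes a ranking model over candidate parents a natural fit for this task.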
Anthology ID:
D18-1371
Volume:
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Month:
October-November
Year:
2018
Address:
Brussels, Belgium
Editors:
Ellen Riloff, David Chiang, Julia Hockenmaier, Jun’ichi Tsujii
Venue:
EMNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
3339–3349
URL:
https://aclanthology.org/D18-1371
DOI:
10.18653/v1/D18-1371
Cite (ACL):
Yuchen Zhang and Nianwen Xue. 2018. Neural Ranking Models for Temporal Dependency Structure Parsing. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pages 3339–3349, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal):
Neural Ranking Models for Temporal Dependency Structure Parsing (Zhang & Xue, EMNLP 2018)
PDF:
https://preview.aclanthology.org/dois-2013-emnlp/D18-1371.pdf
Code:
yuchenz/tdp_ranking