Temporal Information Extraction for Question Answering Using Syntactic Dependencies in an LSTM-based Architecture

Yuanliang Meng, Anna Rumshisky, Alexey Romanov


Abstract
In this paper, we propose to use a set of simple, architecturally uniform LSTM-based models to recover different kinds of temporal relations from text. Using the shortest dependency path between entities as input, the same architecture is used to extract intra-sentence, cross-sentence, and document creation time relations. A “double-checking” technique reverses entity pairs in classification, boosting the recall of positive cases and reducing misclassifications between opposite classes. An efficient pruning algorithm resolves conflicts globally. Evaluated on QA-TempEval (SemEval-2015 Task 5), our proposed technique outperforms state-of-the-art methods by a large margin. We also conduct intrinsic evaluation and post state-of-the-art results on TimeBank-Dense.
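The “double-checking” idea from the abstract can be illustrated with a minimal sketch: classify both orderings of an entity pair, map the reverse prediction back through the inverse relation, and prefer a positive label over NONE to boost recall. The function and label names here are hypothetical, not taken from the paper's implementation.

```python
# Hypothetical sketch of the "double-checking" technique: classify both
# orderings of an entity pair and reconcile the two predictions.

# Each temporal relation mapped to its inverse (symmetric ones map to themselves).
INVERSE = {
    "BEFORE": "AFTER", "AFTER": "BEFORE",
    "INCLUDES": "IS_INCLUDED", "IS_INCLUDED": "INCLUDES",
    "SIMULTANEOUS": "SIMULTANEOUS", "NONE": "NONE",
}

def double_check(classify, e1, e2):
    """classify(a, b) -> predicted relation label for the ordered pair (a, b)."""
    forward = classify(e1, e2)
    # Classify the reversed pair and map its label back to the forward orientation.
    backward = INVERSE[classify(e2, e1)]
    if forward == backward:
        return forward
    # On disagreement, favor a positive relation over NONE to boost recall.
    if forward == "NONE":
        return backward
    return forward
```

With this reconciliation, a pair that the classifier misses in one direction can still be recovered from the reversed direction, which is the recall-boosting effect the abstract describes.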
Anthology ID:
D17-1092
Volume:
Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing
Month:
September
Year:
2017
Address:
Copenhagen, Denmark
Editors:
Martha Palmer, Rebecca Hwa, Sebastian Riedel
Venue:
EMNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Note:
Pages:
887–896
Language:
URL:
https://aclanthology.org/D17-1092
DOI:
10.18653/v1/D17-1092
Bibkey:
Cite (ACL):
Yuanliang Meng, Anna Rumshisky, and Alexey Romanov. 2017. Temporal Information Extraction for Question Answering Using Syntactic Dependencies in an LSTM-based Architecture. In Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing, pages 887–896, Copenhagen, Denmark. Association for Computational Linguistics.
Cite (Informal):
Temporal Information Extraction for Question Answering Using Syntactic Dependencies in an LSTM-based Architecture (Meng et al., EMNLP 2017)
PDF:
https://preview.aclanthology.org/naacl-24-ws-corrections/D17-1092.pdf