Context-Aware Neural Model for Temporal Information Extraction

Yuanliang Meng, Anna Rumshisky


Abstract
We propose a context-aware neural network model for temporal information extraction. The model has a uniform architecture for event-event, event-timex, and timex-timex pairs. A Global Context Layer (GCL), inspired by the Neural Turing Machine (NTM), stores processed temporal relations in narrative order and retrieves them when relevant entities are encountered; relations are then classified in context. The GCL's long-term memory and attention mechanisms resolve irregular long-distance dependencies that standard RNNs such as LSTMs cannot capture. The model requires no new input features, yet outperforms existing models in the literature. To our knowledge, it is also the first model to use an NTM-like architecture to process global context in discourse-scale natural language text. We plan to release the source code.
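To make the GCL's read/write mechanics concrete, below is a minimal sketch of an NTM-style external memory with content-based attention: processed relation vectors are written in narrative order and later retrieved by similarity to the current pair's representation. The class name, dimensions, and cosine-similarity addressing are illustrative assumptions, not the authors' implementation.

import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D array.
    e = np.exp(x - np.max(x))
    return e / e.sum()

class GlobalContextMemory:
    # Hypothetical sketch: one memory slot per processed relation,
    # appended in narrative order and read via attention weights.
    def __init__(self, dim):
        self.dim = dim
        self.slots = []

    def write(self, relation_vec):
        # Store a processed relation representation.
        self.slots.append(np.asarray(relation_vec, dtype=float))

    def read(self, query_vec):
        # Attend over stored relations with cosine-similarity weights
        # and return their weighted sum (zeros if memory is empty).
        if not self.slots:
            return np.zeros(self.dim)
        M = np.stack(self.slots)                  # (n_slots, dim)
        q = np.asarray(query_vec, dtype=float)
        sims = M @ q / (np.linalg.norm(M, axis=1) * np.linalg.norm(q) + 1e-8)
        weights = softmax(sims)                   # attention distribution
        return weights @ M                        # retrieved context vector

# Usage: retrieve global context for the current candidate pair,
# then classify the pair conditioned on that context.
mem = GlobalContextMemory(dim=4)
mem.write([0.1, 0.9, 0.0, 0.2])           # earlier event-event relation
mem.write([0.7, 0.1, 0.3, 0.0])           # earlier event-timex relation
context = mem.read([0.6, 0.2, 0.3, 0.1])  # query for the current pair
print(context)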
Anthology ID:
P18-1049
Volume:
Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2018
Address:
Melbourne, Australia
Editors:
Iryna Gurevych, Yusuke Miyao
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
527–536
URL:
https://aclanthology.org/P18-1049
DOI:
10.18653/v1/P18-1049
Cite (ACL):
Yuanliang Meng and Anna Rumshisky. 2018. Context-Aware Neural Model for Temporal Information Extraction. In Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 527–536, Melbourne, Australia. Association for Computational Linguistics.
Cite (Informal):
Context-Aware Neural Model for Temporal Information Extraction (Meng & Rumshisky, ACL 2018)
PDF:
https://aclanthology.org/P18-1049.pdf
Note:
P18-1049.Notes.txt
Poster:
P18-1049.Poster.pdf