Neural Temporal Relation Extraction
Dmitriy Dligach, Timothy Miller, Chen Lin, Steven Bethard, Guergana Savova
Abstract
We experiment with neural architectures for temporal relation extraction and establish a new state-of-the-art for several scenarios. We find that neural models with only tokens as input outperform state-of-the-art hand-engineered feature-based models, that convolutional neural networks outperform LSTM models, and that encoding relation arguments with XML tags outperforms a traditional position-based encoding.
- Anthology ID:
- E17-2118
- Volume:
- Proceedings of the 15th Conference of the European Chapter of the Association for Computational Linguistics: Volume 2, Short Papers
- Month:
- April
- Year:
- 2017
- Address:
- Valencia, Spain
- Venue:
- EACL
- Publisher:
- Association for Computational Linguistics
- Pages:
- 746–751
- URL:
- https://aclanthology.org/E17-2118
- Cite (ACL):
- Dmitriy Dligach, Timothy Miller, Chen Lin, Steven Bethard, and Guergana Savova. 2017. Neural Temporal Relation Extraction. In Proceedings of the 15th Conference of the European Chapter of the Association for Computational Linguistics: Volume 2, Short Papers, pages 746–751, Valencia, Spain. Association for Computational Linguistics.
- Cite (Informal):
- Neural Temporal Relation Extraction (Dligach et al., EACL 2017)
- PDF:
- https://preview.aclanthology.org/ingestion-script-update/E17-2118.pdf
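The abstract contrasts two ways of presenting relation arguments to a neural model: marking them in-line with XML tags versus the traditional position-based encoding. The following minimal sketch illustrates the two schemes; the tag names (`<e1>`, `<e2>`), the span convention, and the example sentence are assumptions for illustration, not the paper's exact markup.

```python
# Illustrative sketch of the two argument-encoding schemes contrasted in the
# abstract. Tag names (<e1>, <e2>) and inclusive (start, end) token spans are
# assumptions, not the paper's exact markup.

def xml_tag_encoding(tokens, arg1_span, arg2_span):
    """Mark relation arguments in-line by wrapping them in XML-style tags."""
    out = []
    for i, tok in enumerate(tokens):
        if i == arg1_span[0]:
            out.append("<e1>")
        if i == arg2_span[0]:
            out.append("<e2>")
        out.append(tok)
        if i == arg1_span[1]:
            out.append("</e1>")
        if i == arg2_span[1]:
            out.append("</e2>")
    return out

def position_encoding(tokens, arg1_span, arg2_span):
    """Traditional alternative: pair each token with its signed distance
    to each argument span (0 inside the span)."""
    def dist(i, span):
        if i < span[0]:
            return i - span[0]
        if i > span[1]:
            return i - span[1]
        return 0
    return [(tok, dist(i, arg1_span), dist(i, arg2_span))
            for i, tok in enumerate(tokens)]

tokens = "The patient was admitted before the surgery".split()
print(xml_tag_encoding(tokens, (3, 3), (6, 6)))
# → ['The', 'patient', 'was', '<e1>', 'admitted', '</e1>',
#    'before', 'the', '<e2>', 'surgery', '</e2>']
```

With the XML-tag scheme the argument positions become ordinary tokens in the input sequence, so a plain token-embedding model can exploit them; the position-based scheme instead attaches two extra distance features to every token.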