Self-training improves Recurrent Neural Networks performance for Temporal Relation Extraction
Chen Lin, Timothy Miller, Dmitriy Dligach, Hadi Amiri, Steven Bethard, Guergana Savova
Abstract
Neural network models are often restricted by limited labeled instances and resort to advanced architectures and features for cutting-edge performance. We propose to build a recurrent neural network with multiple semantically heterogeneous embeddings within a self-training framework. Our framework makes use of labeled, unlabeled, and social media data, operates on basic features, and is scalable and generalizable. With this method, we establish state-of-the-art results for a clinical temporal relation extraction task in both in-domain and cross-domain settings.
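The self-training loop the abstract describes can be made concrete. Below is a minimal sketch, assuming a generic scikit-learn-style classifier as the base learner; the paper itself trains an RNN over multiple heterogeneous embeddings, and names such as `confidence_threshold` and `max_rounds` are illustrative rather than taken from the paper.

```python
# Illustrative self-training sketch (not the paper's code).
# Inputs are assumed to be NumPy arrays; the base learner here is a
# stand-in for the paper's recurrent neural network.
import numpy as np
from sklearn.linear_model import LogisticRegression

def self_train(X_labeled, y_labeled, X_unlabeled,
               confidence_threshold=0.9, max_rounds=5):
    """Iteratively pseudo-label confident unlabeled instances and retrain."""
    model = LogisticRegression(max_iter=1000)
    X_train, y_train = X_labeled.copy(), y_labeled.copy()
    pool = X_unlabeled.copy()
    for _ in range(max_rounds):
        if len(pool) == 0:
            break
        model.fit(X_train, y_train)
        probs = model.predict_proba(pool)          # (n_pool, n_classes)
        confident = probs.max(axis=1) >= confidence_threshold
        if not confident.any():
            break  # nothing confident enough to self-label this round
        pseudo = model.classes_[probs[confident].argmax(axis=1)]
        X_train = np.vstack([X_train, pool[confident]])
        y_train = np.concatenate([y_train, pseudo])
        pool = pool[~confident]                    # shrink the unlabeled pool
    model.fit(X_train, y_train)                    # final fit on augmented set
    return model
```

Each round, the model labels the unlabeled pool, the most confident predictions are added to the training set as pseudo-labels, and the model is retrained; the confidence threshold controls how aggressively unlabeled data is absorbed.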
- Anthology ID: W18-5619
- Volume: Proceedings of the Ninth International Workshop on Health Text Mining and Information Analysis
- Month: October
- Year: 2018
- Address: Brussels, Belgium
- Editors: Alberto Lavelli, Anne-Lyse Minard, Fabio Rinaldi
- Venue: Louhi
- Publisher: Association for Computational Linguistics
- Pages: 165–176
- URL: https://aclanthology.org/W18-5619
- DOI: 10.18653/v1/W18-5619
- Cite (ACL): Chen Lin, Timothy Miller, Dmitriy Dligach, Hadi Amiri, Steven Bethard, and Guergana Savova. 2018. Self-training improves Recurrent Neural Networks performance for Temporal Relation Extraction. In Proceedings of the Ninth International Workshop on Health Text Mining and Information Analysis, pages 165–176, Brussels, Belgium. Association for Computational Linguistics.
- Cite (Informal): Self-training improves Recurrent Neural Networks performance for Temporal Relation Extraction (Lin et al., Louhi 2018)
- PDF: https://aclanthology.org/W18-5619.pdf
- Data: MIMIC-III