Abstract
A principal barrier to training temporal relation extraction models in new domains is the lack of varied, high-quality examples and the challenge of collecting more. We present a method of automatically collecting distantly-supervised examples of temporal relations. We scrape and automatically label event pairs where the temporal relations are made explicit in text, then mask out those explicit cues, forcing a model trained on this data to learn other signals. We demonstrate that a pre-trained Transformer model is able to transfer from the weakly labeled examples to human-annotated benchmarks in both zero-shot and few-shot settings, and that the masking scheme is important in improving generalization.
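The abstract describes the data-collection recipe only at a high level. A minimal sketch of the core idea follows; it is not the authors' actual pipeline, and the cue lexicon, relation labels, and mask token below are all illustrative assumptions:

```python
import re

# Illustrative cue lexicon: explicit temporal connectives and the
# relation they signal between the events on either side.
# (Hypothetical subset; the paper's actual cue inventory may differ.)
CUES = {
    "before": "BEFORE",
    "after": "AFTER",
}

MASK = "[MASK]"  # assumed mask token, in the style of BERT-like models


def distant_example(sentence):
    """Return (masked_sentence, relation) if the sentence contains an
    explicit temporal cue, else None.

    The cue supplies a free (distant) label for the event pair and is
    then masked out, so a model trained on the result must rely on
    other contextual signals rather than the explicit connective.
    """
    for cue, relation in CUES.items():
        match = re.search(rf"\b{cue}\b", sentence, flags=re.IGNORECASE)
        if match:
            masked = sentence[:match.start()] + MASK + sentence[match.end():]
            return masked, relation
    return None


if __name__ == "__main__":
    print(distant_example("She finished the report before she left the office."))
    # -> ('She finished the report [MASK] she left the office.', 'BEFORE')
```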
- Anthology ID: 2021.adaptnlp-1.20
- Volume: Proceedings of the Second Workshop on Domain Adaptation for NLP
- Month: April
- Year: 2021
- Address: Kyiv, Ukraine
- Editors: Eyal Ben-David, Shay Cohen, Ryan McDonald, Barbara Plank, Roi Reichart, Guy Rotman, Yftah Ziser
- Venue: AdaptNLP
- Publisher: Association for Computational Linguistics
- Pages: 195–203
- URL: https://preview.aclanthology.org/remove-affiliations/2021.adaptnlp-1.20/
- Cite (ACL): Xinyu Zhao, Shih-Ting Lin, and Greg Durrett. 2021. Effective Distant Supervision for Temporal Relation Extraction. In Proceedings of the Second Workshop on Domain Adaptation for NLP, pages 195–203, Kyiv, Ukraine. Association for Computational Linguistics.
- Cite (Informal): Effective Distant Supervision for Temporal Relation Extraction (Zhao et al., AdaptNLP 2021)
- PDF: https://preview.aclanthology.org/remove-affiliations/2021.adaptnlp-1.20.pdf
- Code: xyz-zy/xdomain-temprel + additional community code