Abstract
The widespread presence of wrongly labeled instances is a key challenge for distantly supervised relation extraction. Most previous works are trained in a bag-level setting to alleviate such noise. However, sentence-level training makes better use of the available information than bag-level training, provided it is combined with effective noise alleviation. In this work, we propose a novel Transitive Instance Weighting mechanism integrated with a self-distilled BERT backbone, which exploits information in the intermediate outputs to generate dynamic instance weights for denoised sentence-level training. By down-weighting wrongly labeled instances and discounting the weights of easy-to-fit ones, our method effectively handles wrongly labeled instances while preventing overfitting. Experiments on both held-out and manual datasets show that our method achieves state-of-the-art performance and consistent improvements over the baselines.
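To make the idea concrete, below is a minimal PyTorch sketch of instance-weighted sentence-level training in the spirit the abstract describes: per-sentence weights are derived from the self-distilled model's intermediate outputs, down-weighting instances whose distant label the intermediate classifier finds implausible (likely mislabeled) and discounting instances it already fits with very high confidence (easy-to-fit). The function names, the `easy_margin` heuristic, and the exact weighting rule are illustrative assumptions, not the paper's actual Transitive Instance Weighting formula.

```python
import torch
import torch.nn.functional as F

def transitive_instance_weights(intermediate_logits, labels, easy_margin=0.9):
    """Illustrative dynamic instance weights from intermediate outputs.

    NOTE: This is a hypothetical sketch of the general idea, not the
    paper's TIW formula. Low intermediate probability on the distant
    label -> likely noisy -> small weight; very high probability ->
    easy-to-fit -> discounted weight to curb overfitting.
    """
    probs = F.softmax(intermediate_logits, dim=-1)             # (batch, num_relations)
    p_label = probs.gather(1, labels.unsqueeze(1)).squeeze(1)  # prob. of the distant label
    w = p_label.detach()                                       # down-weight likely-noisy instances
    w = torch.where(p_label > easy_margin, 1.0 - p_label.detach(), w)  # discount easy-to-fit ones
    return w

def weighted_sentence_loss(final_logits, intermediate_logits, labels):
    # Per-sentence cross-entropy scaled by the dynamic instance weights.
    w = transitive_instance_weights(intermediate_logits, labels)
    ce = F.cross_entropy(final_logits, labels, reduction="none")
    return (w * ce).mean()
```

In this sketch the weights are detached from the computation graph, so the gradient only flows through the weighted cross-entropy term; the intermediate classifier acts as a teacher signal, as in self-distillation setups.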
- Anthology ID: 2023.findings-emnlp.13
- Volume: Findings of the Association for Computational Linguistics: EMNLP 2023
- Month: December
- Year: 2023
- Address: Singapore
- Editors: Houda Bouamor, Juan Pino, Kalika Bali
- Venue: Findings
- Publisher: Association for Computational Linguistics
- Pages: 168–180
- URL: https://aclanthology.org/2023.findings-emnlp.13
- DOI: 10.18653/v1/2023.findings-emnlp.13
- Cite (ACL): Xiangyu Lin, Weijia Jia, and Zhiguo Gong. 2023. Self-distilled Transitive Instance Weighting for Denoised Distantly Supervised Relation Extraction. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 168–180, Singapore. Association for Computational Linguistics.
- Cite (Informal): Self-distilled Transitive Instance Weighting for Denoised Distantly Supervised Relation Extraction (Lin et al., Findings 2023)
- PDF: https://preview.aclanthology.org/dois-2013-emnlp/2023.findings-emnlp.13.pdf