Abstract
Jointly extracting entity pairs and their relations is challenging when working on distantly-supervised data with ambiguous or noisy labels. To mitigate the impact of such noise, we propose uncertainty-aware bootstrap learning, motivated by the intuition that the higher the uncertainty of an instance, the more likely the model's confidence is inconsistent with the ground truth. Specifically, we first explore instance-level data uncertainty to create an initial subset of high-confidence examples. This subset filters out noisy instances and helps the model converge quickly in the early stage of training. During bootstrap learning, we propose self-ensembling as a regularizer to alleviate inter-model uncertainty produced by noisy labels. We further define the probability variance of joint tagging probabilities to estimate inner-model parametric uncertainty, which is used to select and build up new reliable training instances for the next iteration. Experimental results on two large datasets show that our approach outperforms existing strong baselines and related methods.
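As a rough illustration only (not the authors' released code): the parametric-uncertainty step described above can be approximated with Monte Carlo dropout, scoring each instance by the variance of its tagging probabilities across several stochastic forward passes and keeping only low-variance instances for the next bootstrap iteration. The model interface, tensor shapes, and threshold below are assumptions for the sketch.

```python
import torch

def mc_dropout_variance(model, batch, n_passes=8):
    """Score per-instance parametric uncertainty as the variance of tagging
    probabilities across stochastic (MC-dropout) forward passes.
    Assumes a hypothetical `model` returning per-token tag probabilities
    of shape (batch, seq_len, n_tags) for the given `batch`."""
    model.train()  # keep dropout layers active at inference time
    with torch.no_grad():
        probs = torch.stack([model(batch) for _ in range(n_passes)])  # (K, B, T, C)
    # Variance over the K passes, averaged over tokens and tag classes,
    # yields one scalar uncertainty score per instance.
    return probs.var(dim=0).mean(dim=(1, 2))  # (B,)

def select_confident(model, batch, threshold=0.05):
    """Keep instances whose uncertainty falls below a (hypothetical)
    threshold, forming the reliable subset for the next iteration."""
    scores = mc_dropout_variance(model, batch)
    return (scores < threshold).nonzero(as_tuple=True)[0]
```

In this sketch, `select_confident` returns the indices that would seed the next iteration's training subset; the paper's actual selection criterion and joint tagging scheme may differ.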
- Anthology ID: 2023.acl-short.116
- Volume: Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)
- Month: July
- Year: 2023
- Address: Toronto, Canada
- Editors: Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
- Venue: ACL
- Publisher: Association for Computational Linguistics
- Pages: 1349–1358
- URL: https://aclanthology.org/2023.acl-short.116
- DOI: 10.18653/v1/2023.acl-short.116
- Cite (ACL): Yufei Li, Xiao Yu, Yanchi Liu, Haifeng Chen, and Cong Liu. 2023. Uncertainty-Aware Bootstrap Learning for Joint Extraction on Distantly-Supervised Data. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 1349–1358, Toronto, Canada. Association for Computational Linguistics.
- Cite (Informal): Uncertainty-Aware Bootstrap Learning for Joint Extraction on Distantly-Supervised Data (Li et al., ACL 2023)
- PDF: https://preview.aclanthology.org/add_acl24_videos/2023.acl-short.116.pdf