Few Clean Instances Help Denoising Distant Supervision

Yufang Liu, Ziyin Huang, Yijun Wang, Changzhi Sun, Man Lan, Yuanbin Wu, Xiaofeng Mou, Ding Wang


Abstract
Existing distantly supervised relation extractors usually rely on noisy data for both model training and evaluation, which may lead to garbage-in-garbage-out systems. To alleviate the problem, we study whether a small clean dataset could help improve the quality of distantly supervised models. We show that, besides yielding a more convincing evaluation of models, a small clean dataset also helps us build more robust denoising models. Specifically, we propose a new criterion for clean instance selection based on influence functions: it collects sample-level evidence for recognizing good instances (which is more informative than loss-level evidence). We also propose a teacher-student mechanism for controlling the purity of intermediate results when bootstrapping the clean set. The whole approach is model-agnostic and demonstrates strong performance in denoising both real (NYT) and synthetic noisy datasets.
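For context, the selection criterion builds on influence functions. Below is a minimal sketch of the standard influence-function estimate (Koh and Liang, 2017) that such criteria start from; the notation (a clean validation instance z_c, learned parameters \hat{\theta}) is background, not the paper's exact sample-level score:

% Approximate effect of up-weighting a training instance z on the loss
% at a clean validation instance z_c, at the learned parameters \hat{\theta}
% (standard influence function; the paper's criterion may differ in detail):
\[
  \mathcal{I}(z, z_c)
    = -\,\nabla_\theta L(z_c, \hat{\theta})^{\top}
       H_{\hat{\theta}}^{-1}\,
       \nabla_\theta L(z, \hat{\theta}),
  \qquad
  H_{\hat{\theta}} = \frac{1}{n} \sum_{i=1}^{n}
       \nabla_\theta^{2} L(z_i, \hat{\theta}).
\]

Under this formulation, up-weighting z by a small \epsilon changes the loss at z_c by approximately \epsilon \cdot \mathcal{I}(z, z_c), so training instances whose influence on the clean set is negative (up-weighting them lowers clean-set loss) are natural candidates to keep when bootstrapping the clean set.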
Anthology ID:
2022.coling-1.223
Volume:
Proceedings of the 29th International Conference on Computational Linguistics
Month:
October
Year:
2022
Address:
Gyeongju, Republic of Korea
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
2528–2539
URL:
https://aclanthology.org/2022.coling-1.223
Cite (ACL):
Yufang Liu, Ziyin Huang, Yijun Wang, Changzhi Sun, Man Lan, Yuanbin Wu, Xiaofeng Mou, and Ding Wang. 2022. Few Clean Instances Help Denoising Distant Supervision. In Proceedings of the 29th International Conference on Computational Linguistics, pages 2528–2539, Gyeongju, Republic of Korea. International Committee on Computational Linguistics.
Cite (Informal):
Few Clean Instances Help Denoising Distant Supervision (Liu et al., COLING 2022)
PDF:
https://preview.aclanthology.org/ingestion-script-update/2022.coling-1.223.pdf
Code:
airuibadi/if_dsre