S2ynRE: Two-stage Self-training with Synthetic data for Low-resource Relation Extraction

Benfeng Xu, Quan Wang, Yajuan Lyu, Dai Dai, Yongdong Zhang, Zhendong Mao


Abstract
Current relation extraction methods suffer from the inadequacy of large-scale annotated data. While distant supervision alleviates the problem of data quantities, there still exists domain disparity in data qualities due to its reliance on domain-restrained knowledge bases. In this work, we propose S2ynRE, a framework of two-stage Self-training with Synthetic data for Relation Extraction. We first leverage the capability of large language models to adapt to the target domain and automatically synthesize large quantities of coherent, realistic training data. We then propose an accompanying two-stage self-training algorithm that iteratively and alternately learns from synthetic and golden data together. We conduct comprehensive experiments and detailed ablations on popular relation extraction datasets to demonstrate the effectiveness of the proposed framework.
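The abstract's two-stage loop (learn from pseudo-labeled synthetic data, then re-anchor on golden data, and repeat) can be sketched with a toy classifier. This is a hypothetical simplification for illustration only, not the paper's actual algorithm: the `CentroidClassifier`, the blending weight `gold_weight`, and the 1-D features are all invented stand-ins for the real relation-extraction model and training procedure.

```python
class CentroidClassifier:
    """Toy stand-in for a relation classifier: nearest class centroid in 1-D."""

    def __init__(self):
        self.centroids = {}

    def fit(self, xs, ys):
        sums, counts = {}, {}
        for x, y in zip(xs, ys):
            sums[y] = sums.get(y, 0.0) + x
            counts[y] = counts.get(y, 0) + 1
        self.centroids = {y: sums[y] / counts[y] for y in sums}

    def predict(self, x):
        return min(self.centroids, key=lambda y: abs(x - self.centroids[y]))


def two_stage_self_training(gold_x, gold_y, synth_x, rounds=3, gold_weight=0.7):
    """Hypothetical two-stage self-training loop, alternating between
    synthetic data (with pseudo-labels) and golden (human-labeled) data."""
    model = CentroidClassifier()
    model.fit(gold_x, gold_y)  # initialize on golden data
    for _ in range(rounds):
        # Stage 1: pseudo-label the synthetic data with the current model
        # and learn from it.
        pseudo_y = [model.predict(x) for x in synth_x]
        synth_model = CentroidClassifier()
        synth_model.fit(synth_x, pseudo_y)
        # Stage 2: re-train on golden data, blending toward gold centroids
        # so the golden labels stay the dominant signal.
        gold_model = CentroidClassifier()
        gold_model.fit(gold_x, gold_y)
        model.centroids = {
            y: gold_weight * gold_model.centroids[y]
               + (1 - gold_weight)
               * synth_model.centroids.get(y, gold_model.centroids[y])
            for y in gold_model.centroids
        }
    return model
```

In this toy version, stage 1 lets unlabeled synthetic examples pull the class centroids toward the target distribution, while stage 2 keeps the model anchored to the golden annotations; the real framework applies the analogous alternation to a neural relation-extraction model.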
Anthology ID:
2023.acl-long.455
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
8186–8207
URL:
https://aclanthology.org/2023.acl-long.455
DOI:
10.18653/v1/2023.acl-long.455
Cite (ACL):
Benfeng Xu, Quan Wang, Yajuan Lyu, Dai Dai, Yongdong Zhang, and Zhendong Mao. 2023. S2ynRE: Two-stage Self-training with Synthetic data for Low-resource Relation Extraction. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 8186–8207, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
S2ynRE: Two-stage Self-training with Synthetic data for Low-resource Relation Extraction (Xu et al., ACL 2023)
PDF:
https://preview.aclanthology.org/add_acl24_videos/2023.acl-long.455.pdf
Video:
https://preview.aclanthology.org/add_acl24_videos/2023.acl-long.455.mp4