RCL: Relation Contrastive Learning for Zero-Shot Relation Extraction

Shusen Wang, Bosen Zhang, Yajing Xu, Yanan Wu, Bo Xiao


Abstract
Zero-shot relation extraction aims to identify novel relations that cannot be observed at the training stage. However, it still faces two challenges: when the unseen relations of instances are similar, or when input sentences contain similar entities, the unseen relation representations from different categories tend to overlap, leading to errors. In this paper, we propose a novel Relation Contrastive Learning framework (RCL) to mitigate these two types of similarity problems: Similar Relations and Similar Entities. By jointly optimizing a contrastive instance loss with a relation classification loss on seen relations, RCL learns subtle differences between instances and, at the same time, achieves better separation between relation categories in the representation space. In particular, contrastive instance learning adopts dropout noise as data augmentation to amplify the semantic differences between similar instances without breaking their relation representations, encouraging the model to learn more effective representations. Experiments on two well-known datasets show that RCL significantly outperforms previous state-of-the-art methods. Moreover, when the seen relations are insufficient, RCL still obtains results comparable to a model trained on the full training set, demonstrating the robustness of our approach.
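
The joint objective described above can be illustrated with a short PyTorch sketch. Everything below is an illustrative assumption reconstructed from the abstract, not the authors' released implementation (see the linked code repository for that): the encoder/classifier interfaces, the InfoNCE form of the contrastive term, and the temperature and alpha hyperparameters are all hypothetical.

import torch
import torch.nn.functional as F

def rcl_loss(encoder, classifier, input_ids, attention_mask, labels,
             temperature=0.05, alpha=1.0):
    # Hypothetical sketch of the joint objective: a contrastive instance
    # loss plus a relation classification loss on seen relations, with
    # dropout noise as the data augmentation. The encoder must be in
    # train mode so that dropout is active.
    # Two forward passes of the SAME batch: dropout inside the encoder
    # produces two slightly different "views" of each instance.
    z1 = encoder(input_ids, attention_mask)  # (B, H) relation representations
    z2 = encoder(input_ids, attention_mask)  # (B, H) second dropout view

    # InfoNCE over the batch: an instance's two dropout views are the
    # positive pair; all other instances in the batch act as negatives.
    z1, z2 = F.normalize(z1, dim=-1), F.normalize(z2, dim=-1)
    sim = z1 @ z2.t() / temperature                       # (B, B) cosine similarities
    targets = torch.arange(sim.size(0), device=sim.device)
    contrastive = F.cross_entropy(sim, targets)

    # Standard classification loss over the seen relation categories.
    logits = classifier(z1)                               # (B, num_seen_relations)
    classification = F.cross_entropy(logits, labels)

    # alpha balances the two terms; its value here is an assumption.
    return classification + alpha * contrastive
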
Anthology ID:
2022.findings-naacl.188
Volume:
Findings of the Association for Computational Linguistics: NAACL 2022
Month:
July
Year:
2022
Address:
Seattle, United States
Editors:
Marine Carpuat, Marie-Catherine de Marneffe, Ivan Vladimir Meza Ruiz
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
2456–2468
URL:
https://aclanthology.org/2022.findings-naacl.188
DOI:
10.18653/v1/2022.findings-naacl.188
Cite (ACL):
Shusen Wang, Bosen Zhang, Yajing Xu, Yanan Wu, and Bo Xiao. 2022. RCL: Relation Contrastive Learning for Zero-Shot Relation Extraction. In Findings of the Association for Computational Linguistics: NAACL 2022, pages 2456–2468, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
RCL: Relation Contrastive Learning for Zero-Shot Relation Extraction (Wang et al., Findings 2022)
PDF:
https://preview.aclanthology.org/landing_page/2022.findings-naacl.188.pdf
Video:
https://preview.aclanthology.org/landing_page/2022.findings-naacl.188.mp4
Code
shusenwang/naacl2022-rcl
Data
FewRel