HiCLRE: A Hierarchical Contrastive Learning Framework for Distantly Supervised Relation Extraction

Dongyang Li, Taolin Zhang, Nan Hu, Chengyu Wang, Xiaofeng He


Abstract
Distant supervision assumes that any sentence containing the same entity pair expresses the same relation. Previous work on the distantly supervised relation extraction (DSRE) task generally focuses on sentence-level or bag-level de-noising techniques independently, neglecting explicit interaction across levels. In this paper, we propose a hierarchical contrastive learning framework for distantly supervised relation extraction (HiCLRE) to reduce noisy sentences, which integrates global structural information and local fine-grained interaction. Specifically, we propose a three-level hierarchical learning framework that interacts across levels, generating de-noising context-aware representations by adapting existing multi-head self-attention, a component named Multi-Granularity Recontextualization. Meanwhile, pseudo positive samples are also provided at each level for contrastive learning via a dynamic gradient-based data augmentation strategy, named Dynamic Gradient Adversarial Perturbation. Experiments demonstrate that HiCLRE significantly outperforms strong baselines on various mainstream DSRE datasets.
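To make the two named components more concrete, below is a minimal PyTorch sketch, not the authors' implementation (their released code is at matnlp/hiclre). The module names, tensor shapes, the FGM-style gradient perturbation, and the InfoNCE-style contrastive loss are illustrative assumptions about how cross-level recontextualization and gradient-based pseudo-positive generation could look.

```python
# Illustrative sketch only; not the HiCLRE reference code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class CrossLevelRecontextualizer(nn.Module):
    """Recontextualize one granularity (e.g., sentence vectors) by attending
    over the representations of the other granularities (e.g., entity and
    bag vectors), using standard multi-head attention."""
    def __init__(self, dim: int, heads: int = 8):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, queries, cross_level_memory):
        # queries:            [batch, n_q, dim]   current level
        # cross_level_memory: [batch, n_kv, dim]  other levels, concatenated
        out, _ = self.attn(queries, cross_level_memory, cross_level_memory)
        return out

def perturbed_positive(reps, task_loss, epsilon=1e-3):
    """Build a pseudo positive view by stepping along the normalized gradient
    of the task loss w.r.t. the representations (an FGM-style perturbation)."""
    grad, = torch.autograd.grad(task_loss, reps, retain_graph=True)
    norm = grad.norm(dim=-1, keepdim=True).clamp_min(1e-12)
    return reps + epsilon * grad / norm

def info_nce(anchor, positive, temperature=0.1):
    """Contrastive loss: the perturbed view is the positive for its anchor,
    and other in-batch representations act as negatives."""
    a = F.normalize(anchor, dim=-1)
    p = F.normalize(positive, dim=-1)
    logits = a @ p.t() / temperature            # [batch, batch] similarities
    labels = torch.arange(a.size(0), device=a.device)
    return F.cross_entropy(logits, labels)
```

In this sketch the contrastive term would be added to the task loss at each level, with the perturbation strength (epsilon) controlling how far the pseudo positive drifts from the original representation.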
Anthology ID: 2022.findings-acl.202
Volume: Findings of the Association for Computational Linguistics: ACL 2022
Month: May
Year: 2022
Address: Dublin, Ireland
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 2567–2578
URL: https://aclanthology.org/2022.findings-acl.202
DOI: 10.18653/v1/2022.findings-acl.202
Cite (ACL): Dongyang Li, Taolin Zhang, Nan Hu, Chengyu Wang, and Xiaofeng He. 2022. HiCLRE: A Hierarchical Contrastive Learning Framework for Distantly Supervised Relation Extraction. In Findings of the Association for Computational Linguistics: ACL 2022, pages 2567–2578, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal): HiCLRE: A Hierarchical Contrastive Learning Framework for Distantly Supervised Relation Extraction (Li et al., Findings 2022)
PDF: https://preview.aclanthology.org/auto-file-uploads/2022.findings-acl.202.pdf
Code: matnlp/hiclre