Commonsense Knowledge Graph Completion Via Contrastive Pretraining and Node Clustering

Siwei Wu, Xiangqing Shen, Rui Xia


Abstract
The nodes in a commonsense knowledge graph (CSKG) are normally represented by free-form short text (e.g., a word or phrase), and different nodes may represent the same concept. This leads to the problems of edge sparsity and node redundancy, which challenge CSKG representation and completion. On the one hand, edge sparsity limits the performance of graph representation learning; on the other hand, node redundancy means that different nodes corresponding to the same concept may have inconsistent relations with other nodes. To address these two problems, we propose a new CSKG completion framework based on Contrastive Pretraining and Node Clustering (CPNC). Contrastive Pretraining constructs positive and negative head-tail node pairs on the CSKG and utilizes contrastive learning to obtain better semantic node representations. Node Clustering aggregates nodes with the same concept into a latent concept, assisting the task of CSKG completion. We evaluate our CPNC approach on two CSKG completion benchmarks (CN-100K and ATOMIC), where CPNC outperforms the state-of-the-art methods. Extensive experiments demonstrate that both Contrastive Pretraining and Node Clustering can significantly improve the performance of CSKG completion. The source code of CPNC is publicly available at https://github.com/NUSTM/CPNC.
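To make the two components concrete, here is a minimal sketch (not the authors' released code; see the GitHub link above for that) of what an InfoNCE-style contrastive objective over head-tail node pairs and a k-means grouping of node embeddings into latent concepts could look like. All names (`contrastive_loss`, `cluster_nodes`, `n_concepts`) are illustrative assumptions, and the encoder producing the embeddings is left abstract.

```python
# Sketch only: contrastive pretraining over head-tail pairs plus
# node clustering into latent concepts, assuming node texts have
# already been encoded into dense vectors by some text encoder.
import torch
import torch.nn.functional as F
from sklearn.cluster import KMeans

def contrastive_loss(head_emb, tail_emb, temperature=0.05):
    """InfoNCE over a batch: (head_i, tail_i) is a positive pair;
    every other tail in the batch serves as an in-batch negative."""
    head_emb = F.normalize(head_emb, dim=-1)
    tail_emb = F.normalize(tail_emb, dim=-1)
    logits = head_emb @ tail_emb.t() / temperature  # (B, B) similarities
    labels = torch.arange(logits.size(0), device=logits.device)
    return F.cross_entropy(logits, labels)

def cluster_nodes(node_emb, n_concepts=100):
    """Group node embeddings with k-means; nodes assigned to the
    same cluster are treated as one underlying latent concept."""
    km = KMeans(n_clusters=n_concepts, n_init=10).fit(node_emb)
    return km.labels_, km.cluster_centers_
```

In this reading, the contrastive step pulls connected head-tail nodes together in embedding space (mitigating edge sparsity), while the clustering step collapses redundant surface forms of the same concept into one cluster (mitigating node redundancy).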
Anthology ID:
2023.findings-acl.878
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
13977–13989
URL:
https://aclanthology.org/2023.findings-acl.878
DOI:
10.18653/v1/2023.findings-acl.878
Cite (ACL):
Siwei Wu, Xiangqing Shen, and Rui Xia. 2023. Commonsense Knowledge Graph Completion Via Contrastive Pretraining and Node Clustering. In Findings of the Association for Computational Linguistics: ACL 2023, pages 13977–13989, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Commonsense Knowledge Graph Completion Via Contrastive Pretraining and Node Clustering (Wu et al., Findings 2023)
PDF:
https://preview.aclanthology.org/nschneid-patch-5/2023.findings-acl.878.pdf
Video:
https://preview.aclanthology.org/nschneid-patch-5/2023.findings-acl.878.mp4