MICO: A Multi-alternative Contrastive Learning Framework for Commonsense Knowledge Representation

Ying Su, Zihao Wang, Tianqing Fang, Hongming Zhang, Yangqiu Song, Tong Zhang


Abstract
Commonsense reasoning tasks such as commonsense knowledge graph completion and commonsense question answering require powerful representation learning. In this paper, we propose MICO, a Multi-alternative contrastIve learning framework on COmmonsense knowledge graphs, to learn commonsense knowledge representations. MICO generates commonsense knowledge representations through contextual interaction between entity nodes and relations, trained with multi-alternative contrastive learning. In MICO, the head and tail entities of an (h, r, t) knowledge triple are converted into two relation-aware natural-language sequences (a premise and an alternative). The semantic representations generated by MICO benefit two downstream tasks simply by comparing similarity scores between representations: 1) zero-shot commonsense question answering; and 2) inductive commonsense knowledge graph completion. Extensive experiments demonstrate the effectiveness of our method.
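The pipeline the abstract describes — verbalizing a triple into a relation-aware (premise, alternative) pair and then scoring the pair by representation similarity — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the relation templates are hypothetical examples, and a toy bag-of-words embedding stands in for the learned sentence encoder.

```python
import math
from collections import Counter

# Hypothetical relation templates (illustrative only; the paper's actual
# verbalizations may differ).
TEMPLATES = {
    "xWant": ("PersonX {h}.", "As a result, PersonX wants {t}."),
    "Causes": ("{h}.", "This causes {t}."),
}

def verbalize(h: str, r: str, t: str) -> tuple[str, str]:
    """Convert a knowledge triple (h, r, t) into a relation-aware
    (premise, alternative) sentence pair."""
    prem_tpl, alt_tpl = TEMPLATES[r]
    return prem_tpl.format(h=h), alt_tpl.format(t=t)

def embed(text: str) -> Counter:
    """Toy bag-of-words 'embedding' standing in for the trained encoder."""
    return Counter(text.lower().split())

def cosine(u: Counter, v: Counter) -> float:
    """Cosine similarity between two sparse bag-of-words vectors."""
    dot = sum(c * v[w] for w, c in u.items())
    nu = math.sqrt(sum(c * c for c in u.values()))
    nv = math.sqrt(sum(c * c for c in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def score(h: str, r: str, t: str) -> float:
    """Score a triple by the similarity of its premise/alternative pair,
    as used for ranking candidates in QA and KG completion."""
    premise, alternative = verbalize(h, r, t)
    return cosine(embed(premise), embed(alternative))
```

At inference time, zero-shot QA or inductive KG completion reduces to ranking candidate tails (or answers) by this similarity score, with no task-specific training.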
Anthology ID:
2022.findings-emnlp.96
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2022
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
1339–1351
URL:
https://aclanthology.org/2022.findings-emnlp.96
DOI:
10.18653/v1/2022.findings-emnlp.96
Cite (ACL):
Ying Su, Zihao Wang, Tianqing Fang, Hongming Zhang, Yangqiu Song, and Tong Zhang. 2022. MICO: A Multi-alternative Contrastive Learning Framework for Commonsense Knowledge Representation. In Findings of the Association for Computational Linguistics: EMNLP 2022, pages 1339–1351, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
MICO: A Multi-alternative Contrastive Learning Framework for Commonsense Knowledge Representation (Su et al., Findings 2022)
PDF:
https://preview.aclanthology.org/nschneid-patch-5/2022.findings-emnlp.96.pdf