Connecting the Knowledge Dots: Retrieval-augmented Knowledge Connection for Commonsense Reasoning

Junho Kim, Soyeon Bak, Mingyu Lee, Minju Hong, Songha Kim, Tae-Eui Kam, SangKeun Lee


Abstract
While large language models (LLMs) have achieved remarkable performance across various natural language processing (NLP) tasks, they exhibit a limited understanding of commonsense reasoning because it requires implicit knowledge that is rarely expressed in text. Recently, retrieval-augmented language models (RALMs) have enhanced their commonsense reasoning ability by incorporating background knowledge from external corpora. However, previous RALMs overlook the implicit nature of commonsense knowledge, so the retrieved documents may not directly contain the information needed to answer questions. In this paper, we propose Retrieval-augmented knowledge Connection, ReConnect, which transforms indirectly relevant documents into a direct explanation for answering the given question. To this end, we extract relevant knowledge from various retrieved document subsets and aggregate it into a direct explanation. Experimental results show that ReConnect outperforms state-of-the-art (SOTA) baselines, achieving improvements of +2.0% and +4.6% average accuracy on in-domain (ID) and out-of-domain (OOD) benchmarks, respectively.
Anthology ID:
2025.emnlp-main.1203
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
23582–23601
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1203/
Cite (ACL):
Junho Kim, Soyeon Bak, Mingyu Lee, Minju Hong, Songha Kim, Tae-Eui Kam, and SangKeun Lee. 2025. Connecting the Knowledge Dots: Retrieval-augmented Knowledge Connection for Commonsense Reasoning. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 23582–23601, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Connecting the Knowledge Dots: Retrieval-augmented Knowledge Connection for Commonsense Reasoning (Kim et al., EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1203.pdf
Checklist:
2025.emnlp-main.1203.checklist.pdf