Abstract
In this paper, we introduce COCONUT to effectively guide the contextualization of structured commonsense knowledge based on large language models. COCONUT employs a contextualized knowledge prompting scheme to gather high-quality contextualization examples from a large language model. These examples are subsequently distilled into small language models to enhance their contextualization capability. Extensive evaluations show that COCONUT considerably improves commonsense reasoning performance across diverse benchmarks, models, and settings, exhibiting its flexibility and universality in generating contextualized commonsense knowledge. Notably, COCONUT consistently outperforms the state-of-the-art technique by an average of 5.8%.
- Anthology ID:
- 2024.findings-acl.346
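The abstract describes a contextualize-then-distill pipeline: a large language model is prompted to verbalize structured knowledge in context, and its outputs become training targets for a small model. The sketch below is a hypothetical illustration of that general pattern only; the function names, prompt format, and example triple are assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch of the contextualize-then-distill pattern from the
# abstract. Prompt format and helper names are illustrative assumptions,
# not the paper's actual method.

def build_contextualization_prompt(triple, question):
    """Ask a large LM to verbalize a knowledge triple in the question's context."""
    head, relation, tail = triple
    return (
        f"Question: {question}\n"
        f"Knowledge: ({head}, {relation}, {tail})\n"
        "Rewrite the knowledge as a sentence relevant to the question:"
    )

def make_distillation_pair(triple, question, teacher_output):
    """Pair the prompt with the large model's output to train a small model."""
    return {
        "input": build_contextualization_prompt(triple, question),
        "target": teacher_output,
    }

# A (hypothetical) contextualized sentence from the teacher model becomes
# a supervised training target for the student model.
pair = make_distillation_pair(
    ("bird", "CapableOf", "fly"),
    "Why did the bird leave the nest?",
    "Birds are capable of flying, so the bird could leave its nest by flying.",
)
print(pair["input"])
```

In practice the teacher output would come from querying a large language model, and many such pairs would form the distillation training set for the small model.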
- Volume:
- Findings of the Association for Computational Linguistics: ACL 2024
- Month:
- August
- Year:
- 2024
- Address:
- Bangkok, Thailand
- Editors:
- Lun-Wei Ku, Andre Martins, Vivek Srikumar
- Venue:
- Findings
- Publisher:
- Association for Computational Linguistics
- Pages:
- 5815–5830
- URL:
- https://preview.aclanthology.org/build-pipeline-with-new-library/2024.findings-acl.346/
- DOI:
- 10.18653/v1/2024.findings-acl.346
- Cite (ACL):
- Jun-Hyung Park, Mingyu Lee, Junho Kim, and SangKeun Lee. 2024. Coconut: Contextualized Commonsense Unified Transformers for Graph-Based Commonsense Augmentation of Language Models. In Findings of the Association for Computational Linguistics: ACL 2024, pages 5815–5830, Bangkok, Thailand. Association for Computational Linguistics.
- Cite (Informal):
- Coconut: Contextualized Commonsense Unified Transformers for Graph-Based Commonsense Augmentation of Language Models (Park et al., Findings 2024)
- PDF:
- https://preview.aclanthology.org/build-pipeline-with-new-library/2024.findings-acl.346.pdf