Multi-level Adaptive Contrastive Learning for Knowledge Internalization in Dialogue Generation

Chenxu Yang, Zheng Lin, Lanrui Wang, Chong Tian, Liang Pang, Jiangnan Li, Qirong Ho, Yanan Cao, Weiping Wang


Abstract
Knowledge-grounded dialogue generation aims to mitigate the issue of text degeneration by incorporating external knowledge to supplement the context. However, the model often fails to internalize this information into responses in a human-like manner; instead, it simply inserts segments of the provided knowledge into generic responses. As a result, the generated responses tend to be tedious, incoherent, and lacking in interactivity, meaning the degeneration problem remains unsolved. In this work, we first find that such copying-style degeneration is primarily due to the weak likelihood objective, which allows the model to “cheat” the objective by merely duplicating knowledge segments through superficial, overlap-based pattern matching. To overcome this challenge, we propose a Multi-level Adaptive Contrastive Learning (MACL) framework that dynamically samples negative examples and penalizes degeneration behaviors at both the token and sequence levels. Extensive experiments on the WoW dataset demonstrate the effectiveness of our approach across various pre-trained models and decoding strategies.
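To make the abstract's core idea concrete, the sketch below shows a generic sequence-level contrastive penalty of the kind the abstract describes: given a context representation, a gold response, and sampled knowledge-copying negatives, an InfoNCE-style loss pulls the context toward the gold response and pushes it away from the negatives. This is a minimal illustration under assumed vector representations, not the authors' MACL objective (their adaptive negative sampling and token-level term are not reproduced here).

```python
import math

def cosine(u, v):
    # Cosine similarity between two dense vectors.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def sequence_contrastive_loss(ctx, gold, negatives, tau=0.1):
    """InfoNCE-style sequence-level penalty (illustrative only).

    ctx:       context representation (list of floats)
    gold:      representation of the human reference response
    negatives: representations of degenerate, knowledge-copying responses
    tau:       temperature; smaller values sharpen the contrast
    """
    pos = math.exp(cosine(ctx, gold) / tau)
    neg = sum(math.exp(cosine(ctx, n) / tau) for n in negatives)
    # Minimizing this drives ctx closer to gold than to any negative.
    return -math.log(pos / (pos + neg))
```

In practice the representations would come from the dialogue model's encoder or decoder states, and the negatives would be sampled dynamically during training rather than fixed in advance, which is the adaptive element the abstract refers to.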
Anthology ID:
2023.emnlp-main.497
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
8002–8015
URL:
https://aclanthology.org/2023.emnlp-main.497
DOI:
10.18653/v1/2023.emnlp-main.497
Cite (ACL):
Chenxu Yang, Zheng Lin, Lanrui Wang, Chong Tian, Liang Pang, Jiangnan Li, Qirong Ho, Yanan Cao, and Weiping Wang. 2023. Multi-level Adaptive Contrastive Learning for Knowledge Internalization in Dialogue Generation. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 8002–8015, Singapore. Association for Computational Linguistics.
Cite (Informal):
Multi-level Adaptive Contrastive Learning for Knowledge Internalization in Dialogue Generation (Yang et al., EMNLP 2023)
PDF:
https://preview.aclanthology.org/nschneid-patch-5/2023.emnlp-main.497.pdf
Video:
https://preview.aclanthology.org/nschneid-patch-5/2023.emnlp-main.497.mp4