Contextual Knowledge Learning for Dialogue Generation

Wen Zheng, Natasa Milic-Frayling, Ke Zhou


Abstract
Incorporating conversational context and knowledge into dialogue generation models has been essential for improving the quality of the generated responses. The context, comprising utterances from previous dialogue exchanges, is used as a source of content for response generation and as a means of selecting external knowledge. However, to avoid introducing irrelevant content, it is key to enable fine-grained scoring of context and knowledge. In this paper, we present a novel approach to context and knowledge weighting as an integral part of model training. We guide the model training through a Contextual Knowledge Learning (CKL) process which involves Latent Vectors for context and knowledge, respectively. CKL Latent Vectors capture the relationship between context, knowledge, and responses through weak supervision and enable differential weighting of context utterances and knowledge sentences during the training process. Experiments with two standard datasets and human evaluation demonstrate that CKL leads to a significant improvement compared with the performance of six strong baseline models and shows robustness with regard to reduced sizes of training sets.
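The abstract describes the weighting mechanism only at a high level. As a rough illustration (not the authors' CKL implementation), the sketch below shows one way learned latent vectors could assign differential weights to encoded context utterances and knowledge sentences and be guided by a weak-supervision signal; all module names, dimensions, and the placeholder supervision targets are assumptions.

```python
# Hypothetical sketch of latent-vector weighting of context utterances and
# knowledge sentences, loosely inspired by the abstract above. Not the paper's
# actual CKL model; names, dimensions, and targets are illustrative only.
import torch
import torch.nn as nn
import torch.nn.functional as F


class LatentWeighting(nn.Module):
    """Scores a set of sentence embeddings against a learned latent vector."""

    def __init__(self, hidden_dim: int):
        super().__init__()
        # One learned latent vector acts as a query over the sentence set.
        self.latent = nn.Parameter(torch.randn(hidden_dim))

    def forward(self, sent_embs: torch.Tensor):
        # sent_embs: (num_sentences, hidden_dim)
        scores = sent_embs @ self.latent           # (num_sentences,)
        weights = F.softmax(scores, dim=-1)        # differential weights
        pooled = weights @ sent_embs               # (hidden_dim,) weighted summary
        return weights, pooled


hidden_dim = 768
context_weighter = LatentWeighting(hidden_dim)     # for context utterances
knowledge_weighter = LatentWeighting(hidden_dim)   # for knowledge sentences

# Toy inputs: 4 encoded context utterances and 6 encoded knowledge sentences.
context_embs = torch.randn(4, hidden_dim)
knowledge_embs = torch.randn(6, hidden_dim)

ctx_weights, ctx_summary = context_weighter(context_embs)
kno_weights, kno_summary = knowledge_weighter(knowledge_embs)

# A weak-supervision signal (e.g. overlap between each knowledge sentence and
# the gold response) could be normalized into target weights and used to guide
# the learned weights during training.
pseudo_targets = F.softmax(torch.randn(6), dim=-1)  # placeholder targets
weighting_loss = F.kl_div(kno_weights.log(), pseudo_targets, reduction="sum")
```

In such a setup, the resulting weights (or the pooled summaries) would typically scale the utterance- and sentence-level representations passed to the response decoder, so that low-scoring, likely irrelevant material contributes less to generation.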
Anthology ID: 2023.acl-long.433
Volume: Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month: July
Year: 2023
Address: Toronto, Canada
Editors: Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue: ACL
Publisher: Association for Computational Linguistics
Pages: 7822–7839
URL: https://aclanthology.org/2023.acl-long.433
DOI: 10.18653/v1/2023.acl-long.433
Cite (ACL): Wen Zheng, Natasa Milic-Frayling, and Ke Zhou. 2023. Contextual Knowledge Learning for Dialogue Generation. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 7822–7839, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal): Contextual Knowledge Learning for Dialogue Generation (Zheng et al., ACL 2023)
PDF: https://preview.aclanthology.org/dois-2013-emnlp/2023.acl-long.433.pdf
Video: https://preview.aclanthology.org/dois-2013-emnlp/2023.acl-long.433.mp4