CoLV: A Collaborative Latent Variable Model for Knowledge-Grounded Dialogue Generation

Haolan Zhan, Lei Shen, Hongshen Chen, Hainan Zhang


Abstract
Knowledge-grounded dialogue generation has achieved promising performance with the engagement of external knowledge sources. Typical approaches towards this task usually perform two relatively independent sub-tasks, i.e., knowledge selection and knowledge-aware response generation. In this paper, to improve the diversity of both knowledge selection and knowledge-aware response generation, we propose a collaborative latent variable (CoLV) model that integrates these two aspects simultaneously in separate yet collaborative latent spaces, so as to capture the inherent correlation between knowledge selection and response generation. During generation, our proposed model first draws a knowledge candidate from the latent space conditioned on the dialogue context, and then samples a response from another collaborative latent space conditioned on both the context and the selected knowledge. Experimental results on two widely-used knowledge-grounded dialogue datasets show that our model outperforms previous methods on both knowledge selection and response generation.
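The abstract describes a two-stage generative process: a knowledge latent variable sampled from the dialogue context drives knowledge selection, and a second, collaborative latent variable conditioned on the context plus the selected knowledge drives response generation. The sketch below is a minimal illustration of that sampling structure only; the module names, dimensions, simple MLP priors, and argmax-based selection are assumptions for illustration and are not taken from the paper.

import torch
import torch.nn as nn

def reparameterize(mu, logvar):
    # Standard reparameterization trick: z = mu + sigma * eps
    return mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)

class CollaborativeLatents(nn.Module):
    # Hypothetical sketch of two collaborative latent spaces:
    # z_k for knowledge selection, z_r for response generation.
    def __init__(self, ctx_dim=128, know_dim=128, z_dim=32):
        super().__init__()
        # Prior over the knowledge latent, conditioned on the dialogue context
        self.prior_k = nn.Linear(ctx_dim, 2 * z_dim)
        # Prior over the response latent, conditioned on context + selected knowledge
        self.prior_r = nn.Linear(ctx_dim + know_dim, 2 * z_dim)
        # Scores each knowledge candidate against the sampled knowledge latent
        self.selector = nn.Bilinear(z_dim, know_dim, 1)

    def forward(self, ctx, candidates):
        # ctx: (batch, ctx_dim); candidates: (batch, num_candidates, know_dim)
        # 1) Sample the knowledge latent from p(z_k | context)
        mu_k, logvar_k = self.prior_k(ctx).chunk(2, dim=-1)
        z_k = reparameterize(mu_k, logvar_k)
        # 2) Pick a knowledge candidate using z_k (argmax here for brevity)
        scores = self.selector(
            z_k.unsqueeze(1).expand(-1, candidates.size(1), -1), candidates
        ).squeeze(-1)
        idx = scores.argmax(dim=-1)
        knowledge = candidates[torch.arange(ctx.size(0)), idx]
        # 3) Sample the response latent from p(z_r | context, selected knowledge)
        mu_r, logvar_r = self.prior_r(torch.cat([ctx, knowledge], dim=-1)).chunk(2, dim=-1)
        z_r = reparameterize(mu_r, logvar_r)
        return idx, z_r  # z_r would then condition a response decoder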
Anthology ID:
2021.emnlp-main.172
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
2250–2261
URL:
https://aclanthology.org/2021.emnlp-main.172
DOI:
10.18653/v1/2021.emnlp-main.172
Cite (ACL):
Haolan Zhan, Lei Shen, Hongshen Chen, and Hainan Zhang. 2021. CoLV: A Collaborative Latent Variable Model for Knowledge-Grounded Dialogue Generation. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 2250–2261, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
CoLV: A Collaborative Latent Variable Model for Knowledge-Grounded Dialogue Generation (Zhan et al., EMNLP 2021)
PDF:
https://preview.aclanthology.org/ingestion-script-update/2021.emnlp-main.172.pdf
Video:
https://preview.aclanthology.org/ingestion-script-update/2021.emnlp-main.172.mp4
Data
Holl-E
Wizard of Wikipedia