Are Training Samples Correlated? Learning to Generate Dialogue Responses with Multiple References

Lisong Qiu, Juntao Li, Wei Bi, Dongyan Zhao, Rui Yan


Abstract
Owing to its many potential applications, open-domain dialogue generation has become popular and achieved remarkable progress in recent years, yet it sometimes suffers from generic responses. Previous models are generally trained on a 1-to-1 mapping from an input query to its response, which ignores the 1-to-n nature of dialogue: multiple valid responses may correspond to the same query. In this paper, we propose to utilize multiple references by considering the correlation among different valid responses and modeling the 1-to-n mapping with a novel two-step generation architecture. The first generation phase extracts the common features of the different responses, which, combined with the distinctive features obtained in the second phase, can generate multiple diverse and appropriate responses. Experimental results show that our proposed model effectively improves response quality and outperforms existing neural dialogue models in both automatic and human evaluations.
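
The abstract's two-step scheme (a common feature shared by all references to a query, plus a distinctive feature per response) can be illustrated with a minimal PyTorch sketch. Everything below, including module names, dimensions, and the choice of a Gaussian latent for the distinctive feature, is an assumption made for illustration, not the authors' actual architecture; the full model is described in the linked PDF.

import torch
import torch.nn as nn

class TwoStepGenerator(nn.Module):
    """Hypothetical two-step response generator: phase 1 extracts a
    deterministic 'common' feature from the query; phase 2 samples a
    'distinctive' feature; the decoder conditions on both."""
    def __init__(self, vocab_size, emb_dim=128, hid_dim=256, z_dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        # Phase 1: common feature intended to capture what all valid
        # responses to this query share.
        self.common = nn.Linear(hid_dim, z_dim)
        # Phase 2: Gaussian distinctive feature, one sample per response
        # (an assumed parameterization, CVAE-style).
        self.mu = nn.Linear(hid_dim, z_dim)
        self.logvar = nn.Linear(hid_dim, z_dim)
        self.decoder = nn.GRU(emb_dim + 2 * z_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, query_ids, response_ids):
        _, h = self.encoder(self.embed(query_ids))   # h: (1, B, hid_dim)
        h = h.squeeze(0)
        c = self.common(h)                           # common feature
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()  # reparameterize
        # Condition every decoding step on [common; distinctive].
        cond = torch.cat([c, z], dim=-1)
        emb = self.embed(response_ids)
        cond = cond.unsqueeze(1).expand(-1, emb.size(1), -1)
        dec_out, _ = self.decoder(torch.cat([emb, cond], dim=-1))
        return self.out(dec_out)                     # per-token logits

# Usage: resampling z for the same query yields diverse candidate responses.
model = TwoStepGenerator(vocab_size=10000)
query = torch.randint(0, 10000, (2, 12))
response = torch.randint(0, 10000, (2, 15))
logits = model(query, response)                      # (2, 15, 10000)
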
Anthology ID: P19-1372
Volume: Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
Month: July
Year: 2019
Address: Florence, Italy
Editors: Anna Korhonen, David Traum, Lluís Màrquez
Venue: ACL
Publisher: Association for Computational Linguistics
Pages: 3826–3835
URL: https://aclanthology.org/P19-1372
DOI: 10.18653/v1/P19-1372
Cite (ACL): Lisong Qiu, Juntao Li, Wei Bi, Dongyan Zhao, and Rui Yan. 2019. Are Training Samples Correlated? Learning to Generate Dialogue Responses with Multiple References. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pages 3826–3835, Florence, Italy. Association for Computational Linguistics.
Cite (Informal): Are Training Samples Correlated? Learning to Generate Dialogue Responses with Multiple References (Qiu et al., ACL 2019)
PDF: https://preview.aclanthology.org/nschneid-patch-5/P19-1372.pdf