Yahui Liu
2021
Assessing Dialogue Systems with Distribution Distances
Jiannan Xiang | Yahui Liu | Deng Cai | Huayang Li | Defu Lian | Lemao Liu
Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021
2018
Towards Less Generic Responses in Neural Conversation Models: A Statistical Re-weighting Method
Yahui Liu | Wei Bi | Jun Gao | Xiaojiang Liu | Jian Yao | Shuming Shi
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Sequence-to-sequence neural generation models have achieved promising performance on short-text conversation tasks. However, they tend to generate generic, dull responses, leading to an unsatisfying dialogue experience. We observe that in conversation tasks each query can have multiple responses, forming a 1-to-n or m-to-n relationship over the whole corpus. The objective function used in standard sequence-to-sequence models is therefore dominated by loss terms from responses with generic patterns. Motivated by this observation, we introduce a statistical re-weighting method that assigns different weights to the multiple responses of the same query and trains a common neural generation model with these weights. Experimental results on a large Chinese dialogue corpus show that our method improves the acceptance rate of generated responses over several baseline models and significantly reduces the number of generic responses generated.
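The abstract does not spell out the weighting formula, but the core idea (give smaller training weight to responses that recur generically across many queries) can be sketched as follows. This is a minimal illustration, assuming an inverse-corpus-frequency weight; the function names and the exact weighting scheme here are hypothetical, not the paper's actual method.

```python
from collections import Counter
import math

def response_weights(pairs):
    """Assign a weight to each response in a (query, response) corpus.

    Responses that appear for many different queries (e.g. "I don't know")
    are treated as generic and receive smaller weights. The log-inverse-
    frequency form below is an illustrative assumption, not the paper's
    published formula.
    """
    freq = Counter(resp for _, resp in pairs)
    n = len(pairs)
    return {resp: math.log(1.0 + n / count) for resp, count in freq.items()}

def reweighted_loss(nll_per_example, pairs, weights):
    """Average per-example negative log-likelihood, scaled by response weight.

    In standard seq2seq training every example contributes equally; here the
    loss terms of generic responses are down-weighted so they no longer
    dominate the objective.
    """
    total = sum(weights[resp] * nll
                for (_, resp), nll in zip(pairs, nll_per_example))
    return total / len(pairs)

# Toy corpus: one generic response shared by three queries, one specific one.
pairs = [
    ("how are you", "i don't know"),
    ("where is it", "i don't know"),
    ("what time",   "i don't know"),
    ("when leaves", "the train leaves at nine"),
]
w = response_weights(pairs)
```

Under this scheme the specific response gets a strictly larger weight than the generic one, so its loss term is emphasized during training relative to uniform weighting.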