Lianqiang Zhou
2019
Improving Open-Domain Dialogue Systems via Multi-Turn Incomplete Utterance Restoration
Zhufeng Pan | Kun Bai | Yan Wang | Lianqiang Zhou | Xiaojiang Liu
Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)
In multi-turn dialogue, utterances do not always take the form of full sentences. Such incomplete utterances greatly degrade the performance of open-domain dialogue systems, and restoring them from context could help the systems generate more relevant responses. To facilitate the study of incomplete utterance restoration for open-domain dialogue systems, we collect a large-scale multi-turn dataset, Restoration-200K, manually labeled with the explicit relation between each utterance and its context. We also propose a "pick-and-combine" model to restore an incomplete utterance from its context. Experimental results demonstrate that the annotated dataset and the proposed approach significantly boost the response quality of both single-turn and multi-turn dialogue systems.
2018
Context-Sensitive Generation of Open-Domain Conversational Responses
Weinan Zhang | Yiming Cui | Yifa Wang | Qingfu Zhu | Lingzhi Li | Lianqiang Zhou | Ting Liu
Proceedings of the 27th International Conference on Computational Linguistics
Despite the success of existing work on single-turn conversation generation, human conversation is, when coherence is taken into consideration, actually a context-sensitive process. Inspired by existing studies, this paper proposes static and dynamic attention-based approaches for context-sensitive generation of open-domain conversational responses. Experimental results on two public datasets show that the proposed static attention-based approach outperforms all the baselines on both automatic and human evaluation.
Co-authors
- Weinan Zhang
- Yiming Cui
- Yifa Wang
- Qingfu Zhu
- Lingzhi Li
- Lianqiang Zhou
- Ting Liu
- Zhufeng Pan
- Kun Bai
- Yan Wang
- Xiaojiang Liu