2020
Dual Dynamic Memory Network for End-to-End Multi-turn Task-oriented Dialog Systems
Jian Wang | Junhao Liu | Wei Bi | Xiaojiang Liu | Kejing He | Ruifeng Xu | Min Yang
Proceedings of the 28th International Conference on Computational Linguistics
Existing end-to-end task-oriented dialog systems struggle to dynamically model long dialog context for interactions and to effectively incorporate knowledge base (KB) information into dialog generation. To overcome these limitations, we propose a Dual Dynamic Memory Network (DDMN) for multi-turn dialog generation, which maintains two core components: a dialog memory manager and a KB memory manager. The dialog memory manager expands the dialog memory turn by turn and keeps track of dialog history with an updating mechanism, which encourages the model to filter out irrelevant dialog history and memorize important newly arriving information. The KB memory manager shares the structured KB triples throughout the whole conversation and dynamically extracts KB information with a memory pointer at each turn. Experimental results on three benchmark datasets demonstrate that DDMN significantly outperforms strong baselines in terms of both automatic and human evaluation. Our code is available at https://github.com/siat-nlp/DDMN.
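To make the abstract's turn-by-turn memory idea concrete, here is a minimal, pure-Python sketch of a dialog memory that grows by one slot per turn, refreshes old slots with a gate against the newest turn, and answers queries with an attention-style read. All names, the scalar gating formula, and the class interface are illustrative assumptions for exposition, not the paper's actual DDMN implementation.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

class DialogMemoryManager:
    """Toy turn-by-turn dialog memory (illustrative sketch only):
    each turn appends one slot, and a scalar gate decides how much
    every older slot blends toward the newest turn's vector."""

    def __init__(self, dim):
        self.dim = dim
        self.slots = []  # one vector (list of floats) per stored turn

    def write(self, turn_vec):
        """Expand memory with the new turn, then refresh old slots."""
        updated = []
        for slot in self.slots:
            g = sigmoid(dot(slot, turn_vec))      # relevance gate (assumed form)
            updated.append([g * s + (1.0 - g) * t  # keep/overwrite mix
                            for s, t in zip(slot, turn_vec)])
        self.slots = updated + [list(turn_vec)]

    def read(self, query_vec):
        """Attention read: softmax over slot-query dot products,
        then a weighted sum of the memory slots."""
        scores = [dot(slot, query_vec) for slot in self.slots]
        m = max(scores)
        exps = [math.exp(s - m) for s in scores]
        z = sum(exps)
        weights = [e / z for e in exps]
        return [sum(w * slot[i] for w, slot in zip(weights, self.slots))
                for i in range(self.dim)]
```

In this sketch the memory strictly expands (one slot per turn), while the gate lets stale slots drift toward recent content; the paper's updating mechanism and KB memory pointer operate on learned representations rather than these hand-rolled scalars.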