State-Aware Adversarial Training for Utterance-Level Dialogue Generation

Yi Huang, Xiaoting Wu, Wei Hu, Junlan Feng, Chao Deng


Abstract
Dialogue generation is a challenging problem because it not only requires us to model the context of a conversation but also to exploit that context to generate a coherent and fluent utterance. This paper, aiming at a specific topic in this field, proposes an adversarial-training-based framework for utterance-level dialogue generation. Technically, we train an encoder-decoder generator simultaneously with a discriminative classifier that makes the generated utterance consistent with the state-aware inputs. Experiments on the MultiWoZ 2.0 and MultiWoZ 2.1 datasets show that our method achieves significant improvements on both automatic and human evaluations, and demonstrate the effectiveness of our framework in low-resource settings. We further explore the effect of fine-grained augmentation on the downstream dialogue state tracking (DST) task. Experimental results demonstrate that the high-quality data generated by our proposed framework improves performance over state-of-the-art models.
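To make the training scheme in the abstract concrete, below is a minimal, illustrative sketch of jointly training an encoder-decoder generator with a discriminator that scores whether an utterance matches the state-aware input. All module sizes, names, loss weights, and the toy data are assumptions for illustration (the paper's actual architecture and objective may differ); the discriminator here sees soft token distributions so the adversarial signal stays differentiable.

```python
# Hypothetical sketch of state-aware adversarial training, NOT the authors' code.
import torch
import torch.nn as nn

VOCAB, STATE_DIM, HID = 100, 16, 32  # assumed toy sizes

class Generator(nn.Module):
    """Encoder-decoder over token ids, conditioned on a dialogue-state vector."""
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(VOCAB, HID)
        self.enc = nn.GRU(HID, HID, batch_first=True)
        self.dec = nn.GRU(HID + STATE_DIM, HID, batch_first=True)
        self.out = nn.Linear(HID, VOCAB)

    def forward(self, ctx_ids, state):
        # Encode the dialogue context, then decode conditioned on the state.
        _, h = self.enc(self.emb(ctx_ids))
        T = ctx_ids.size(1)
        dec_in = torch.cat([self.emb(ctx_ids),
                            state.unsqueeze(1).expand(-1, T, -1)], dim=-1)
        dec_out, _ = self.dec(dec_in, h)
        return self.out(dec_out)  # (B, T, VOCAB) logits

class Discriminator(nn.Module):
    """Classifies whether an utterance (soft token distributions) fits the state."""
    def __init__(self):
        super().__init__()
        self.proj = nn.Linear(VOCAB, HID)
        self.rnn = nn.GRU(HID, HID, batch_first=True)
        self.cls = nn.Linear(HID + STATE_DIM, 1)

    def forward(self, token_probs, state):
        _, h = self.rnn(self.proj(token_probs))
        return self.cls(torch.cat([h[-1], state], dim=-1))  # (B, 1) logit

G, D = Generator(), Discriminator()
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

# Toy batch: random contexts, gold utterances, and state vectors.
ctx = torch.randint(0, VOCAB, (8, 12))
gold = torch.randint(0, VOCAB, (8, 12))
state = torch.randn(8, STATE_DIM)

for step in range(3):
    # --- Discriminator step: real (gold) vs. generated utterances. ---
    with torch.no_grad():
        fake = torch.softmax(G(ctx, state), dim=-1)
    real = nn.functional.one_hot(gold, VOCAB).float()
    loss_d = bce(D(real, state), torch.ones(8, 1)) + \
             bce(D(fake, state), torch.zeros(8, 1))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()

    # --- Generator step: MLE loss plus adversarial term from D. ---
    logits = G(ctx, state)
    loss_mle = nn.functional.cross_entropy(logits.reshape(-1, VOCAB),
                                           gold.reshape(-1))
    loss_adv = bce(D(torch.softmax(logits, dim=-1), state), torch.ones(8, 1))
    loss_g = loss_mle + 0.1 * loss_adv  # 0.1 is an assumed weighting
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()
```

The key design point this sketch illustrates is that the discriminator conditions on the same state-aware input as the generator, so its signal pushes generated utterances toward state consistency rather than mere fluency.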
Anthology ID:
2022.seretod-1.8
Volume:
Proceedings of the Towards Semi-Supervised and Reinforced Task-Oriented Dialog Systems (SereTOD)
Month:
December
Year:
2022
Address:
Abu Dhabi, Beijing (Hybrid)
Editors:
Zhijian Ou, Junlan Feng, Juanzi Li
Venue:
SereTOD
Publisher:
Association for Computational Linguistics
Pages:
62–74
URL:
https://aclanthology.org/2022.seretod-1.8
DOI:
10.18653/v1/2022.seretod-1.8
Cite (ACL):
Yi Huang, Xiaoting Wu, Wei Hu, Junlan Feng, and Chao Deng. 2022. State-Aware Adversarial Training for Utterance-Level Dialogue Generation. In Proceedings of the Towards Semi-Supervised and Reinforced Task-Oriented Dialog Systems (SereTOD), pages 62–74, Abu Dhabi, Beijing (Hybrid). Association for Computational Linguistics.
Cite (Informal):
State-Aware Adversarial Training for Utterance-Level Dialogue Generation (Huang et al., SereTOD 2022)
PDF:
https://preview.aclanthology.org/ingest-acl-2023-videos/2022.seretod-1.8.pdf
Video:
https://preview.aclanthology.org/ingest-acl-2023-videos/2022.seretod-1.8.mp4