Stylized Dialogue Generation with Feature-Guided Knowledge Augmentation
Jinpeng Li, Zekai Zhang, Xiuying Chen, Dongyan Zhao, Rui Yan
Abstract
Stylized dialogue generation systems aim to produce coherent and context-aware dialogues while effectively emulating the desired style. Generating stylized dialogue is valuable yet challenging due to the scarcity of parallel data. Existing methods often synthesize pseudo data through back translation, yet they suffer from noisy and context-agnostic style signals caused by insufficient guidance on target style features. To address this, we propose a knowledge-augmented stylized dialogue generation model, which includes a feature-guided style knowledge selection module that utilizes context and response features. Specifically, we retrieve dialogue-related style sentences from a style corpus to explicitly provide clear style signals. We design a feature-guided selection module with response-related contrastive learning and style-responsiveness Kullback-Leibler losses to enhance generation at both the semantic and stylized levels. Our approach demonstrates satisfactory performance on two public stylized dialogue benchmarks in both automatic and human evaluations.
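The abstract names two auxiliary training objectives: a response-related contrastive loss and a style-responsiveness Kullback-Leibler loss. The sketch below is a rough illustration only, not the paper's implementation; it assumes encoder embeddings for responses and retrieved style sentences, and the function names, tensor shapes, and loss weights are all hypothetical.

```python
# Minimal sketch (not the authors' code): an InfoNCE-style contrastive term
# between response and retrieved style-sentence embeddings, plus a KL term
# nudging the generator's distribution to respond to the selected style knowledge.
import torch
import torch.nn.functional as F

def contrastive_loss(resp_emb, style_emb, temperature=0.07):
    """Each response is pulled toward its retrieved style sentence and pushed
    away from the other in-batch style sentences.
    resp_emb, style_emb: (batch, dim) tensors from some encoder (assumed)."""
    resp_emb = F.normalize(resp_emb, dim=-1)
    style_emb = F.normalize(style_emb, dim=-1)
    logits = resp_emb @ style_emb.t() / temperature          # (batch, batch)
    targets = torch.arange(resp_emb.size(0), device=resp_emb.device)
    return F.cross_entropy(logits, targets)

def style_kl_loss(style_logits, stylized_logits):
    """KL divergence between the style-conditioned output distribution and a
    reference distribution (both placeholders for illustration)."""
    log_p = F.log_softmax(stylized_logits, dim=-1)  # with style knowledge
    q = F.softmax(style_logits, dim=-1)             # reference distribution
    return F.kl_div(log_p, q, reduction="batchmean")

def total_loss(nll, resp_emb, style_emb, style_logits, stylized_logits,
               alpha=0.5, beta=0.5):
    """Hypothetical combined objective: generation NLL plus the two auxiliary
    terms; alpha and beta are illustrative weights."""
    return (nll
            + alpha * contrastive_loss(resp_emb, style_emb)
            + beta * style_kl_loss(style_logits, stylized_logits))
```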
- Anthology ID: 2023.findings-emnlp.475
- Volume: Findings of the Association for Computational Linguistics: EMNLP 2023
- Month: December
- Year: 2023
- Address: Singapore
- Editors: Houda Bouamor, Juan Pino, Kalika Bali
- Venue: Findings
- Publisher: Association for Computational Linguistics
- Pages: 7144–7157
- URL: https://aclanthology.org/2023.findings-emnlp.475
- DOI: 10.18653/v1/2023.findings-emnlp.475
- Cite (ACL): Jinpeng Li, Zekai Zhang, Xiuying Chen, Dongyan Zhao, and Rui Yan. 2023. Stylized Dialogue Generation with Feature-Guided Knowledge Augmentation. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 7144–7157, Singapore. Association for Computational Linguistics.
- Cite (Informal): Stylized Dialogue Generation with Feature-Guided Knowledge Augmentation (Li et al., Findings 2023)
- PDF: https://preview.aclanthology.org/ingest-2024-clasp/2023.findings-emnlp.475.pdf