Attribute Controlled Dialogue Prompting
Runcheng Liu, Ahmad Rashid, Ivan Kobyzev, Mehdi Rezagholizadeh, Pascal Poupart
Abstract
Prompt-tuning has become an increasingly popular parameter-efficient method for adapting large pretrained language models to downstream tasks. However, both discrete and continuous prompting assume fixed prompts for all data samples within a task, neglecting the fact that inputs vary greatly in some tasks, such as open-domain dialogue generation. In this paper, we present a novel, instance-specific prompt-tuning algorithm for dialogue generation. Specifically, we generate prompts based on instance-level control codes, rather than the conversation history, to explore their impact on controlled dialogue generation. Experiments on popular open-domain dialogue datasets, evaluated with both automated metrics and human evaluation, demonstrate that our method is superior to prompting baselines and comparable to fine-tuning with only 5%–6% of total parameters.

- Anthology ID: 2023.findings-acl.150
- Volume: Findings of the Association for Computational Linguistics: ACL 2023
- Month: July
- Year: 2023
- Address: Toronto, Canada
- Editors: Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
- Venue: Findings
- Publisher: Association for Computational Linguistics
- Pages: 2380–2389
- URL: https://aclanthology.org/2023.findings-acl.150
- DOI: 10.18653/v1/2023.findings-acl.150
- Cite (ACL): Runcheng Liu, Ahmad Rashid, Ivan Kobyzev, Mehdi Rezagholizadeh, and Pascal Poupart. 2023. Attribute Controlled Dialogue Prompting. In Findings of the Association for Computational Linguistics: ACL 2023, pages 2380–2389, Toronto, Canada. Association for Computational Linguistics.
- Cite (Informal): Attribute Controlled Dialogue Prompting (Liu et al., Findings 2023)
- PDF: https://preview.aclanthology.org/ingest-acl-2023-videos/2023.findings-acl.150.pdf
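As a rough illustration of the idea described in the abstract, instance-specific prompting can be sketched as a small trainable network that maps an instance-level control code to continuous prompt vectors, which are then prepended to the frozen language model's input embeddings. All names, dimensions, and the generator architecture below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

# Hypothetical sketch (not the paper's implementation): a prompt generator
# maps an instance-level control code (e.g. a persona or topic id) to a
# sequence of continuous prompt vectors. Only the generator's parameters
# (code_embedding, W, b) would be trained; the language model stays frozen.

rng = np.random.default_rng(0)

NUM_CODES = 8    # number of distinct attribute control codes (assumed)
CODE_DIM = 16    # control-code embedding size (assumed)
PROMPT_LEN = 5   # number of prompt vectors per instance (assumed)
MODEL_DIM = 32   # hidden size of the frozen language model (assumed)

# Trainable parameters of the prompt generator.
code_embedding = rng.normal(size=(NUM_CODES, CODE_DIM))
W = rng.normal(size=(CODE_DIM, PROMPT_LEN * MODEL_DIM)) * 0.02
b = np.zeros(PROMPT_LEN * MODEL_DIM)

def generate_prompt(code_id: int) -> np.ndarray:
    """Map one control code to PROMPT_LEN continuous prompt vectors."""
    e = code_embedding[code_id]        # (CODE_DIM,)
    p = np.tanh(e @ W + b)             # (PROMPT_LEN * MODEL_DIM,)
    return p.reshape(PROMPT_LEN, MODEL_DIM)

def prepend_prompt(code_id: int, token_embeds: np.ndarray) -> np.ndarray:
    """Prepend the instance-specific prompt to the input token embeddings."""
    return np.concatenate([generate_prompt(code_id), token_embeds], axis=0)

utterance = rng.normal(size=(10, MODEL_DIM))  # 10 embedded dialogue tokens
augmented = prepend_prompt(code_id=3, token_embeds=utterance)
print(augmented.shape)  # (15, 32)
```

Because only the prompt generator is trained while the language model is frozen, the trainable parameter count stays at a small fraction of the full model, which is the sense in which the abstract compares against fine-tuning at 5%–6% of total parameters.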