Efficient Cross-Task Prompt Tuning for Few-Shot Conversational Emotion Recognition

Yige Xu, Zhiwei Zeng, Zhiqi Shen


Abstract
Emotion Recognition in Conversation (ERC) has been widely studied due to its importance in developing emotion-aware empathetic machines. The rise of pre-trained language models (PLMs) has further pushed the limit of ERC performance. However, most recent ERC works using PLMs are heavily data-driven and require fine-tuning the entire PLM. To improve both sample and computational efficiency, we propose a derivative-free optimization method called Cross-Task Prompt Tuning (CTPT) for few-shot conversational emotion recognition. Unlike existing methods that learn independent knowledge from individual tasks, CTPT leverages sharable cross-task knowledge by exploiting external knowledge from other source tasks to improve learning performance under the few-shot setting. Moreover, CTPT only needs to optimize a vector under the low intrinsic dimensionality without gradients, which is highly parameter-efficient compared with existing approaches. Experiments on five different contextual conversation datasets demonstrate that our CTPT method achieves superior results in both few-shot scenarios and zero-shot transfer.
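To illustrate the core idea of optimizing a low-intrinsic-dimensional vector without gradients, here is a minimal sketch in Python. It is not the paper's implementation: the soft-prompt shape, the fixed random projection, the simple (1+1) evolution strategy, and the placeholder `score_prompt` objective (which would, in the real setting, be a forward pass of a frozen PLM on few-shot ERC examples) are all illustrative assumptions.

```python
import numpy as np

# Hypothetical stand-in for the frozen PLM's loss on few-shot data, given a
# soft prompt prepended to the input. Here it is a dummy objective so the
# sketch is self-contained and runnable.
def score_prompt(prompt_embeddings: np.ndarray) -> float:
    return float(np.linalg.norm(prompt_embeddings - 0.1))

N_TOKENS, HIDDEN = 16, 768   # assumed soft-prompt shape
INTRINSIC_DIM = 100          # low intrinsic dimension d << N_TOKENS * HIDDEN

rng = np.random.default_rng(0)
# Fixed random projection from the low-dimensional subspace to prompt space.
A = rng.normal(0, 1.0 / np.sqrt(INTRINSIC_DIM),
               size=(INTRINSIC_DIM, N_TOKENS * HIDDEN))

def expand(z: np.ndarray) -> np.ndarray:
    """Map the low-dimensional vector z to full prompt embeddings."""
    return (z @ A).reshape(N_TOKENS, HIDDEN)

# Derivative-free search over z: a (1+1) evolution strategy, used here as a
# generic stand-in for the derivative-free optimizer; no backprop through
# the PLM is needed, so only INTRINSIC_DIM parameters are ever tuned.
z = np.zeros(INTRINSIC_DIM)
best = score_prompt(expand(z))
sigma = 0.1
for step in range(200):
    candidate = z + sigma * rng.normal(size=INTRINSIC_DIM)
    loss = score_prompt(expand(candidate))
    if loss < best:          # accept the candidate only if it lowers the loss
        z, best = candidate, loss
print("final loss:", best)
```

Because only the 100-dimensional vector `z` is updated, the search stays parameter-efficient regardless of the PLM's size; cross-task transfer could then amount to initializing `z` (or the projection) from source tasks.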
Anthology ID: 2023.findings-emnlp.780
Volume: Findings of the Association for Computational Linguistics: EMNLP 2023
Month: December
Year: 2023
Address: Singapore
Editors: Houda Bouamor, Juan Pino, Kalika Bali
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 11654–11666
URL: https://aclanthology.org/2023.findings-emnlp.780
DOI: 10.18653/v1/2023.findings-emnlp.780
Cite (ACL): Yige Xu, Zhiwei Zeng, and Zhiqi Shen. 2023. Efficient Cross-Task Prompt Tuning for Few-Shot Conversational Emotion Recognition. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 11654–11666, Singapore. Association for Computational Linguistics.
Cite (Informal): Efficient Cross-Task Prompt Tuning for Few-Shot Conversational Emotion Recognition (Xu et al., Findings 2023)
PDF: https://preview.aclanthology.org/emnlp-22-attachments/2023.findings-emnlp.780.pdf