A Simple Yet Strong Domain-Agnostic De-bias Method for Zero-Shot Sentiment Classification

Yang Zhao, Tetsuya Nasukawa, Masayasu Muraoka, Bishwaranjan Bhattacharjee


Abstract
Zero-shot prompt-based learning has made much progress in sentiment analysis, and considerable effort has been devoted to designing high-performing prompt templates. However, two problems remain. First, large language models are often biased toward their pre-training data, which leads to poor performance on prompt templates that the models have rarely seen. Second, adapting to a new domain usually requires redesigning the prompt templates, which is time-consuming and inefficient. To remedy both shortcomings, we propose a simple yet strong data construction method to de-bias a given prompt template, yielding a large performance improvement on sentiment analysis tasks across different domains, pre-trained language models, and prompt templates. We also demonstrate the advantage of using domain-agnostic generic responses over in-domain ground-truth data.
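
To make the setting concrete, below is a minimal sketch of zero-shot prompt-based sentiment classification with a simple template-bias correction. The prompt template, label words, generic probe strings, and the divide-by-prior calibration step are all illustrative assumptions for exposition, not the paper's actual data construction method.

import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

# Any masked language model can stand in here; bert-base-uncased is only an example.
model_name = "bert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)
model.eval()

# Hypothetical prompt template and label words, chosen for illustration.
TEMPLATE = "{text} It was {mask}."
LABEL_WORDS = {"positive": "great", "negative": "terrible"}

def label_probs(text):
    """Probability the masked LM assigns to each label word in the template."""
    prompt = TEMPLATE.format(text=text, mask=tokenizer.mask_token)
    inputs = tokenizer(prompt, return_tensors="pt", truncation=True)
    mask_pos = (inputs.input_ids == tokenizer.mask_token_id).nonzero(as_tuple=True)[1][0]
    with torch.no_grad():
        logits = model(**inputs).logits[0, mask_pos]
    probs = logits.softmax(dim=-1)
    return {label: probs[tokenizer.convert_tokens_to_ids(word)].item()
            for label, word in LABEL_WORDS.items()}

# Estimate the template's prior bias from generic, domain-agnostic inputs.
# Content-free probes are an assumption here; the paper builds its own
# generic-response data rather than using these strings.
GENERIC_PROBES = ["", "N/A", "..."]
prior = {label: 0.0 for label in LABEL_WORDS}
for probe in GENERIC_PROBES:
    for label, p in label_probs(probe).items():
        prior[label] += p / len(GENERIC_PROBES)

def debiased_predict(text):
    """Offset the template's bias by dividing raw scores by the estimated prior."""
    raw = label_probs(text)
    return max(raw, key=lambda label: raw[label] / prior[label])

print(debiased_predict("The battery died after two days."))

Without the division by the prior, a template that inherently favors one label word over the other will skew predictions regardless of the input; the correction factors that preference out.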
Anthology ID:
2023.findings-acl.242
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
3923–3931
URL:
https://aclanthology.org/2023.findings-acl.242
DOI:
10.18653/v1/2023.findings-acl.242
Cite (ACL):
Yang Zhao, Tetsuya Nasukawa, Masayasu Muraoka, and Bishwaranjan Bhattacharjee. 2023. A Simple Yet Strong Domain-Agnostic De-bias Method for Zero-Shot Sentiment Classification. In Findings of the Association for Computational Linguistics: ACL 2023, pages 3923–3931, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
A Simple Yet Strong Domain-Agnostic De-bias Method for Zero-Shot Sentiment Classification (Zhao et al., Findings 2023)
PDF:
https://preview.aclanthology.org/emnlp22-frontmatter/2023.findings-acl.242.pdf
Video:
https://preview.aclanthology.org/emnlp22-frontmatter/2023.findings-acl.242.mp4