A Simple-Yet-Efficient Instruction Augmentation Method for Zero-Shot Sentiment Classification
Yang Zhao, Masayasu Muraoka, Issei Yoshida, Bishwaranjan Bhattacharjee, Hiroshi Kanayama
Abstract
Instruction tuning significantly enhances the performance of large language models in tasks such as sentiment classification. Previous studies have leveraged labeled instances from sentiment benchmark datasets to instruction-tune LLMs, improving zero-shot sentiment classification performance. In this work, we propose a simple-yet-efficient instruction augmentation method that does not rely on any actual labeled sentiment instances. With just 240 pseudo instruction instances, the proposed method significantly improves the classification performance across several LLMs on 12 benchmark datasets, increasing scores by 30 points and outperforming LLMs that utilize more complex instruction tuning methods by 5.1 points. Surprisingly, the models tuned with the 240 pseudo instruction instances even outperform those tuned with actual domain-specific instruction instances. Despite the method’s simplicity, our further analysis suggests that the probability shift toward the positive and negative classes, together with its generalization ability, may be the primary driver of the improvement.
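The abstract does not spell out how the 240 pseudo instruction instances are built, so the sketch below is only an illustrative guess at the general idea, not the authors' actual procedure. It template-fills synthetic sentiment-bearing sentences and writes them as instruction/output pairs of the kind typically used for instruction tuning; every vocabulary list, the prompt wording, the `build_pseudo_instances` helper, and the output file name are assumptions made for demonstration.

```python
# Illustrative sketch only: the paper's real pseudo-instruction templates are not
# given in this abstract, so all templates, phrases, and counts below are assumptions.
import json
import random

random.seed(0)

# Hypothetical seed vocabulary used to synthesize sentiment-bearing sentences
# without touching any labeled benchmark instance.
SUBJECTS = ["The movie", "The service", "This laptop", "The hotel room", "Her latest album"]
POSITIVE_PHRASES = ["exceeded my expectations", "was absolutely delightful", "worked flawlessly"]
NEGATIVE_PHRASES = ["was a complete letdown", "kept breaking down", "felt like a waste of money"]

INSTRUCTION = (
    "Classify the sentiment of the following text as positive or negative.\n"
    "Text: {text}\nAnswer:"
)

def build_pseudo_instances(n_per_class: int = 120):
    """Create pseudo instruction-tuning instances (no real labeled data used)."""
    instances = []
    for label, phrases in (("positive", POSITIVE_PHRASES), ("negative", NEGATIVE_PHRASES)):
        for _ in range(n_per_class):
            text = f"{random.choice(SUBJECTS)} {random.choice(phrases)}."
            instances.append({
                "instruction": INSTRUCTION.format(text=text),
                "output": label,
            })
    random.shuffle(instances)
    return instances

if __name__ == "__main__":
    # 240 pseudo instances in total, matching the count mentioned in the abstract.
    data = build_pseudo_instances(n_per_class=120)
    with open("pseudo_sentiment_instructions.jsonl", "w") as f:
        for example in data:
            f.write(json.dumps(example) + "\n")
    print(f"Wrote {len(data)} pseudo instruction instances.")
```

A file like this could then be fed to any standard instruction-tuning pipeline; the paper's actual templates, models, and training details are in the full PDF linked below.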
- Anthology ID: 2025.coling-main.107
- Volume: Proceedings of the 31st International Conference on Computational Linguistics
- Month: January
- Year: 2025
- Address: Abu Dhabi, UAE
- Editors: Owen Rambow, Leo Wanner, Marianna Apidianaki, Hend Al-Khalifa, Barbara Di Eugenio, Steven Schockaert
- Venue: COLING
- Publisher: Association for Computational Linguistics
- Pages: 1585–1599
- URL: https://preview.aclanthology.org/jlcl-multiple-ingestion/2025.coling-main.107/
- Cite (ACL): Yang Zhao, Masayasu Muraoka, Issei Yoshida, Bishwaranjan Bhattacharjee, and Hiroshi Kanayama. 2025. A Simple-Yet-Efficient Instruction Augmentation Method for Zero-Shot Sentiment Classification. In Proceedings of the 31st International Conference on Computational Linguistics, pages 1585–1599, Abu Dhabi, UAE. Association for Computational Linguistics.
- Cite (Informal): A Simple-Yet-Efficient Instruction Augmentation Method for Zero-Shot Sentiment Classification (Zhao et al., COLING 2025)
- PDF: https://preview.aclanthology.org/jlcl-multiple-ingestion/2025.coling-main.107.pdf