ELECTRA and GPT-4o: Cost-Effective Partners for Sentiment Analysis

James P. Beno


Abstract
Bidirectional transformers excel at sentiment analysis, and Large Language Models (LLMs) are effective zero-shot learners. Might they perform better as a team? This paper explores collaborative approaches between ELECTRA and GPT-4o for three-way sentiment classification. We fine-tuned (FT) four models (ELECTRA Base/Large, GPT-4o/4o-mini) using a mix of reviews from Stanford Sentiment Treebank (SST) and DynaSent. We provided input from ELECTRA to GPT in three forms: the predicted label, class probabilities, and retrieved examples. Sharing ELECTRA Base FT predictions with GPT-4o-mini significantly improved performance over either model alone (82.50 macro F1 vs. 79.14 ELECTRA Base FT, 79.41 GPT-4o-mini) and yielded the lowest cost/performance ratio ($0.12/F1 point). However, when GPT models were fine-tuned, including predictions decreased performance. GPT-4o FT-M was the top performer (86.99), with GPT-4o-mini FT close behind (86.70) at much lower cost ($0.38 vs. $1.59/F1 point). Our results show that augmenting prompts with predictions from fine-tuned encoders is an efficient way to boost performance, and a fine-tuned GPT-4o-mini is nearly as good as GPT-4o FT at 76% less cost. Both are affordable options for projects with limited resources.
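As a rough illustration of the prompt-augmentation setup the abstract describes (passing a fine-tuned encoder's prediction to an LLM), the sketch below runs a review through an ELECTRA classifier and then includes the predicted label and class probabilities in a GPT-4o-mini prompt. The checkpoint name `electra-base-sst-dynasent-ft`, the label set, and the prompt wording are illustrative assumptions, not the paper's exact artifacts or prompts.

```python
# Minimal sketch: augment a GPT-4o-mini prompt with a fine-tuned ELECTRA's
# prediction. Checkpoint path, labels, and prompt text are assumptions.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification
from openai import OpenAI

LABELS = ["negative", "neutral", "positive"]  # three-way sentiment classes

# Hypothetical fine-tuned ELECTRA Base checkpoint (stand-in path).
tokenizer = AutoTokenizer.from_pretrained("electra-base-sst-dynasent-ft")
encoder = AutoModelForSequenceClassification.from_pretrained("electra-base-sst-dynasent-ft")

def electra_predict(text: str):
    """Return ELECTRA's predicted label and per-class probabilities."""
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = encoder(**inputs).logits
    probs = torch.softmax(logits, dim=-1).squeeze().tolist()
    return LABELS[int(torch.argmax(logits))], dict(zip(LABELS, probs))

def classify_with_gpt(text: str) -> str:
    """Ask GPT-4o-mini for a label, sharing ELECTRA's prediction in the prompt."""
    label, probs = electra_predict(text)
    prompt = (
        "Classify the sentiment of the review as negative, neutral, or positive.\n"
        f"Review: {text}\n"
        f"A fine-tuned ELECTRA model predicted: {label} (probabilities: {probs}).\n"
        "Answer with a single word."
    )
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
        temperature=0,
    )
    return response.choices[0].message.content.strip().lower()

print(classify_with_gpt("The plot dragged, but the acting was superb."))
```

In the collaborative (non-fine-tuned) setting reported in the abstract, this kind of shared prediction lifted macro F1 above either model alone; when the GPT models were themselves fine-tuned, adding the encoder's prediction no longer helped.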
Anthology ID: 2025.knowledgenlp-1.2
Volume: Proceedings of the 4th International Workshop on Knowledge-Augmented Methods for Natural Language Processing
Month: May
Year: 2025
Address: Albuquerque, New Mexico, USA
Editors: Weijia Shi, Wenhao Yu, Akari Asai, Meng Jiang, Greg Durrett, Hannaneh Hajishirzi, Luke Zettlemoyer
Venues: KnowledgeNLP | WS
Publisher: Association for Computational Linguistics
Pages: 18–36
URL: https://preview.aclanthology.org/fix-sig-urls/2025.knowledgenlp-1.2/
Cite (ACL): James P. Beno. 2025. ELECTRA and GPT-4o: Cost-Effective Partners for Sentiment Analysis. In Proceedings of the 4th International Workshop on Knowledge-Augmented Methods for Natural Language Processing, pages 18–36, Albuquerque, New Mexico, USA. Association for Computational Linguistics.
Cite (Informal): ELECTRA and GPT-4o: Cost-Effective Partners for Sentiment Analysis (Beno, KnowledgeNLP 2025)
PDF: https://preview.aclanthology.org/fix-sig-urls/2025.knowledgenlp-1.2.pdf