Deciphering Political Entity Sentiment in News with Large Language Models: Zero-Shot and Few-Shot Strategies

Alapan Kuila, Sudeshna Sarkar


Abstract
Sentiment analysis plays a pivotal role in understanding public opinion, particularly in the political domain, where the portrayal of entities in news articles influences public perception. In this paper, we investigate the effectiveness of Large Language Models (LLMs) in predicting entity-specific sentiment from political news articles. Leveraging zero-shot and few-shot strategies, we explore the capability of LLMs to discern sentiment towards political entities in news content. Employing a chain-of-thought (CoT) approach augmented with rationales in few-shot in-context learning, we assess whether this method enhances sentiment prediction accuracy. Our evaluation on sentiment-labeled datasets demonstrates that LLMs outperform fine-tuned BERT models in capturing entity-specific sentiment. We find that in-context learning significantly improves model performance, while the self-consistency mechanism makes sentiment predictions more consistent. Despite the promising results, we observe inconsistencies in the effectiveness of the CoT prompting method. Overall, our findings underscore the potential of LLMs for entity-centric sentiment analysis in the political news domain and highlight the importance of suitable prompting strategies and model architectures.
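The prompting setup described in the abstract can be illustrated with a minimal sketch. The code below is not the authors' implementation: the prompt wording, the demonstration snippet and rationale, and the `generate()` callable standing in for an arbitrary LLM API are assumptions, and self-consistency is approximated here by a simple majority vote over sampled completions.

```python
from collections import Counter

# Illustrative few-shot chain-of-thought prompt for entity-specific sentiment.
# The demonstration text and rationale are invented placeholders, not examples
# taken from the paper's dataset.
FEW_SHOT_COT = """Classify the sentiment expressed towards the target entity \
in the news snippet as positive, negative, or neutral.

Snippet: The opposition leader was praised for brokering the cross-party deal.
Entity: opposition leader
Rationale: The snippet reports praise directed at the entity, so the portrayal is favourable.
Sentiment: positive

Snippet: {snippet}
Entity: {entity}
Rationale:"""


def build_prompt(snippet: str, entity: str) -> str:
    """Fill the few-shot CoT template for one (snippet, entity) pair."""
    return FEW_SHOT_COT.format(snippet=snippet, entity=entity)


def parse_label(completion: str) -> str:
    """Pull the final sentiment label out of a rationale-then-label completion."""
    tail = completion.lower().split("sentiment:")[-1]
    for label in ("positive", "negative", "neutral"):
        if label in tail:
            return label
    return "neutral"  # fallback when no label is found


def predict_with_self_consistency(snippet: str, entity: str, generate, n_samples: int = 5) -> str:
    """Sample several CoT completions and return the majority-vote label.

    `generate(prompt)` is a placeholder for any LLM call returning a string;
    it should be sampled with non-zero temperature so the votes can differ.
    """
    prompt = build_prompt(snippet, entity)
    votes = [parse_label(generate(prompt)) for _ in range(n_samples)]
    return Counter(votes).most_common(1)[0][0]
```

In use, `generate` would wrap whatever sampling-based completion endpoint is available; a zero-shot variant simply drops the worked demonstration from the template.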
Anthology ID:
2024.politicalnlp-1.1
Volume:
Proceedings of the Second Workshop on Natural Language Processing for Political Sciences @ LREC-COLING 2024
Month:
May
Year:
2024
Address:
Torino, Italia
Editors:
Haithem Afli, Houda Bouamor, Cristina Blasi Casagran, Sahar Ghannay
Venues:
PoliticalNLP | WS
Publisher:
ELRA and ICCL
Pages:
1–11
URL:
https://aclanthology.org/2024.politicalnlp-1.1
Cite (ACL):
Alapan Kuila and Sudeshna Sarkar. 2024. Deciphering Political Entity Sentiment in News with Large Language Models: Zero-Shot and Few-Shot Strategies. In Proceedings of the Second Workshop on Natural Language Processing for Political Sciences @ LREC-COLING 2024, pages 1–11, Torino, Italia. ELRA and ICCL.
Cite (Informal):
Deciphering Political Entity Sentiment in News with Large Language Models: Zero-Shot and Few-Shot Strategies (Kuila & Sarkar, PoliticalNLP-WS 2024)
PDF:
https://preview.aclanthology.org/nschneid-patch-2/2024.politicalnlp-1.1.pdf