Why the Unexpected? Dissecting the Political and Economic Bias in Persian Small and Large Language Models
Ehsan Barkhordar, Surendrabikram Thapa, Ashwarya Maratha, Usman Naseem
Abstract
Recently, language models (LMs) like BERT and large language models (LLMs) like GPT-4 have demonstrated potential in various linguistic tasks such as text generation, translation, and sentiment analysis. However, these abilities come with the risk of perpetuating biases from their training data, and political and economic inclinations play a significant role in shaping these biases. This research therefore aims to understand political and economic biases in Persian LMs and LLMs, addressing a significant gap in AI ethics and fairness research. Focusing on the Persian language, our research employs a two-step methodology: first, we utilize the political compass test adapted to Persian; second, we analyze the biases present in these models. Our findings indicate the presence of nuanced biases, underscoring the importance of ethical considerations in AI deployments within Persian-speaking contexts.
- Anthology ID:
- 2024.sigul-1.49
- Volume:
- Proceedings of the 3rd Annual Meeting of the Special Interest Group on Under-resourced Languages @ LREC-COLING 2024
- Month:
- May
- Year:
- 2024
- Address:
- Torino, Italia
- Editors:
- Maite Melero, Sakriani Sakti, Claudia Soria
- Venues:
- SIGUL | WS
- Publisher:
- ELRA and ICCL
- Pages:
- 410–420
- URL:
- https://aclanthology.org/2024.sigul-1.49
- Cite (ACL):
- Ehsan Barkhordar, Surendrabikram Thapa, Ashwarya Maratha, and Usman Naseem. 2024. Why the Unexpected? Dissecting the Political and Economic Bias in Persian Small and Large Language Models. In Proceedings of the 3rd Annual Meeting of the Special Interest Group on Under-resourced Languages @ LREC-COLING 2024, pages 410–420, Torino, Italia. ELRA and ICCL.
- Cite (Informal):
- Why the Unexpected? Dissecting the Political and Economic Bias in Persian Small and Large Language Models (Barkhordar et al., SIGUL-WS 2024)
- PDF:
- https://preview.aclanthology.org/dois-2013-emnlp/2024.sigul-1.49.pdf