Zero-Shot Multi-Label Classification of Bangla Documents: Large Decoders Vs. Classic Encoders

Souvika Sarkar, Md Najib Hasan, Santu Karmaker


Abstract
Bangla, a language spoken by over 300 million native speakers and ranked as the sixth most spoken language worldwide, presents unique challenges in natural language processing (NLP) due to its complex morphology and limited resources. Although recent large-decoder-based LLMs, such as GPT, LLaMA, and DeepSeek, have demonstrated excellent performance across many NLP tasks, their effectiveness in Bangla remains largely unexplored. In this paper, we establish the first benchmark comparing large decoder-based LLMs with classic encoder-based models for the Zero-Shot Multi-Label Classification (Zero-Shot-MLC) task in Bangla. Our evaluation of 32 state-of-the-art models reveals that even ostensibly powerful encoders and decoders still struggle to achieve high accuracy on the Bangla Zero-Shot-MLC task, suggesting a need for more research and resources for Bangla NLP.
Anthology ID:
2025.banglalp-1.7
Volume:
Proceedings of the Second Workshop on Bangla Language Processing (BLP-2025)
Month:
December
Year:
2025
Address:
Mumbai, India
Editors:
Firoj Alam, Sudipta Kar, Shammur Absar Chowdhury, Naeemul Hassan, Enamul Hoque Prince, Mohiuddin Tasnim, Md Rashad Al Hasan Rony, Md Tahmid Rahman
Venues:
BanglaLP | WS
Publisher:
Association for Computational Linguistics
Pages:
91–106
URL:
https://preview.aclanthology.org/ingest-ijcnlp-aacl/2025.banglalp-1.7/
Cite (ACL):
Souvika Sarkar, Md Najib Hasan, and Santu Karmaker. 2025. Zero-Shot Multi-Label Classification of Bangla Documents: Large Decoders Vs. Classic Encoders. In Proceedings of the Second Workshop on Bangla Language Processing (BLP-2025), pages 91–106, Mumbai, India. Association for Computational Linguistics.
Cite (Informal):
Zero-Shot Multi-Label Classification of Bangla Documents: Large Decoders Vs. Classic Encoders (Sarkar et al., BanglaLP 2025)
PDF:
https://preview.aclanthology.org/ingest-ijcnlp-aacl/2025.banglalp-1.7.pdf