Enhancing BERT Fine-Tuning for Sentiment Analysis in Lower-Resourced Languages

Jozef Kubík, Marek Suppa, Martin Takac


Abstract
Limited data for low-resource languages typically yields weaker language models (LMs). Since pre-training is compute-intensive, it is more pragmatic to target improvements during fine-tuning. In this work, we examine the use of Active Learning (AL) methods augmented by structured data selection strategies across epochs, which we term ‘Active Learning schedulers,’ to boost the fine-tuning process when training data is limited. We connect the AL process to data clustering and propose an integrated fine-tuning pipeline that systematically combines AL, data clustering, and dynamic data selection schedulers to enhance model performance. Experiments on Slovak, Maltese, Icelandic, and Turkish show that combining clustering with novel AL scheduling during fine-tuning can simultaneously yield annotation savings of up to 30% and performance improvements of up to four F1 points, while also improving fine-tuning stability.
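The clustering-aware selection step described in the abstract can be sketched roughly as follows. This is an illustrative sketch only: the k-means routine, the entropy-based uncertainty measure, and the round-robin per-cluster selection are assumptions for demonstration, not the paper's exact scheduler or clustering method.

```python
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    """Minimal k-means over embedding vectors X; returns a cluster label per row."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        # squared Euclidean distance from every point to every center
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d.argmin(1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(0)
    return labels

def select_batch(entropy, clusters, budget):
    """Pick `budget` samples for annotation: round-robin over clusters,
    taking the most uncertain (highest-entropy) sample from each in turn,
    so the batch mixes uncertainty with cluster diversity."""
    budget = min(budget, len(entropy))
    k = clusters.max() + 1
    ranked = [sorted(np.where(clusters == j)[0], key=lambda i: -entropy[i])
              for j in range(k)]
    chosen, t = [], 0
    while len(chosen) < budget:
        for j in range(k):
            if t < len(ranked[j]) and len(chosen) < budget:
                chosen.append(int(ranked[j][t]))
        t += 1
    return chosen

# Usage: cluster pool embeddings once, then select per AL round.
rng = np.random.default_rng(1)
pool = rng.normal(size=(40, 8))          # stand-in for sentence embeddings
probs = rng.dirichlet(np.ones(3), size=40)  # stand-in for model class probabilities
entropy = -(probs * np.log(probs)).sum(1)
clusters = kmeans(pool, k=4)
batch = select_batch(entropy, clusters, budget=8)
```

A scheduler in the paper's sense would then vary this selection policy across epochs (e.g. shifting the balance between uncertainty and diversity); the round-robin above is one fixed policy standing in for that.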
Anthology ID:
2025.ijcnlp-short.23
Volume:
Proceedings of the 14th International Joint Conference on Natural Language Processing and the 4th Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics
Month:
December
Year:
2025
Address:
Mumbai, India
Editors:
Kentaro Inui, Sakriani Sakti, Haofen Wang, Derek F. Wong, Pushpak Bhattacharyya, Biplab Banerjee, Asif Ekbal, Tanmoy Chakraborty, Dhirendra Pratap Singh
Venues:
IJCNLP | AACL
Publisher:
The Asian Federation of Natural Language Processing and The Association for Computational Linguistics
Pages:
260–272
URL:
https://preview.aclanthology.org/ingest-ijcnlp-aacl/2025.ijcnlp-short.23/
Cite (ACL):
Jozef Kubík, Marek Suppa, and Martin Takac. 2025. Enhancing BERT Fine-Tuning for Sentiment Analysis in Lower-Resourced Languages. In Proceedings of the 14th International Joint Conference on Natural Language Processing and the 4th Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics, pages 260–272, Mumbai, India. The Asian Federation of Natural Language Processing and The Association for Computational Linguistics.
Cite (Informal):
Enhancing BERT Fine-Tuning for Sentiment Analysis in Lower-Resourced Languages (Kubík et al., IJCNLP-AACL 2025)
PDF:
https://preview.aclanthology.org/ingest-ijcnlp-aacl/2025.ijcnlp-short.23.pdf