Impact of Task Adapting on Transformer Models for Targeted Sentiment Analysis in Croatian Headlines

Sofia Lee, Jelke Bloem


Abstract
Transformer models, such as BERT, are often taken off-the-shelf and then fine-tuned on a downstream task. Although this is sufficient for many tasks, low-resource settings require special attention. We demonstrate an approach that applies an extra stage of self-supervised task-adaptive pre-training to a number of Croatian-supporting Transformer models. In particular, we focus on approaches to language, domain, and task adaptation. The task in question is targeted sentiment analysis for Croatian news headlines. We produce new state-of-the-art results (F1 = 0.781), but the highest-performing model still struggles with irony and implicature. Overall, we find that task-adaptive pre-training benefits massively multilingual models but not Croatian-dominant models.
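
The task-adaptive pre-training stage described in the abstract amounts to continuing self-supervised training on unlabeled text from the task domain before fine-tuning on the labeled task. Below is a minimal sketch of what such a stage could look like with the Hugging Face transformers library, assuming a masked-language-modeling objective; the checkpoint name xlm-roberta-base, the file headlines.txt, and all hyperparameters are illustrative placeholders, not the paper's actual setup.

from datasets import load_dataset
from transformers import (
    AutoTokenizer, AutoModelForMaskedLM,
    DataCollatorForLanguageModeling, Trainer, TrainingArguments,
)

model_name = "xlm-roberta-base"  # placeholder massively multilingual checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)

# Unlabeled task-domain text (e.g. news headlines), one example per line; path is hypothetical.
raw = load_dataset("text", data_files={"train": "headlines.txt"})["train"]
tokenized = raw.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=64),
    batched=True,
    remove_columns=["text"],
)

# Dynamic token masking supplies the self-supervised MLM objective.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

args = TrainingArguments(
    output_dir="tapt-checkpoint",
    num_train_epochs=3,
    per_device_train_batch_size=32,
    learning_rate=5e-5,
)
Trainer(model=model, args=args, train_dataset=tokenized, data_collator=collator).train()
# The adapted checkpoint can afterwards be fine-tuned on the labeled targeted-sentiment data.

The same idea carries over to models whose pre-training objective is not MLM (for instance ELECTRA-style models), but the head and data collator would then need to match that objective.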
Anthology ID:
2024.lrec-main.760
Volume:
Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024)
Month:
May
Year:
2024
Address:
Torino, Italia
Editors:
Nicoletta Calzolari, Min-Yen Kan, Veronique Hoste, Alessandro Lenci, Sakriani Sakti, Nianwen Xue
Venues:
LREC | COLING
Publisher:
ELRA and ICCL
Pages:
8662–8674
URL:
https://aclanthology.org/2024.lrec-main.760
Cite (ACL):
Sofia Lee and Jelke Bloem. 2024. Impact of Task Adapting on Transformer Models for Targeted Sentiment Analysis in Croatian Headlines. In Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024), pages 8662–8674, Torino, Italia. ELRA and ICCL.
Cite (Informal):
Impact of Task Adapting on Transformer Models for Targeted Sentiment Analysis in Croatian Headlines (Lee & Bloem, LREC-COLING 2024)
PDF:
https://aclanthology.org/2024.lrec-main.760.pdf