Analyzing and Adapting Large Language Models for Few-Shot Multilingual NLU: Are We There Yet?

Evgeniia Razumovskaia, Ivan Vulić, Anna Korhonen


Abstract
Supervised fine-tuning (SFT), supervised instruction tuning (SIT), and in-context learning (ICL) are three alternative, de facto standard approaches to few-shot learning. ICL has gained popularity recently with the advent of LLMs due to its simplicity and sample efficiency. Prior research has conducted only limited investigation into how these approaches work for multilingual few-shot learning, and the focus so far has been mostly on their performance. In this work, we present an extensive and systematic comparison of the three approaches, testing them on a variety of high- and low-resource languages over five different NLU tasks, and a myriad of language and domain setups. Importantly, performance is only one aspect of the comparison; we also analyze and discuss the approaches through the lens of their computational, inference, and financial costs. Among the highlighted findings is an excellent trade-off between performance and resource requirements/cost for SIT. We further analyze the impact of target language adaptation of pretrained LLMs and find that standard adaptation approaches can (superficially) improve target language generation capabilities, but language understanding elicited through ICL does not improve accordingly and remains limited, especially for low-resource languages.
Anthology ID:
2025.tacl-1.51
Volume:
Transactions of the Association for Computational Linguistics, Volume 13
Year:
2025
Address:
Cambridge, MA
Venue:
TACL
Publisher:
MIT Press
Pages:
1096–1120
URL:
https://preview.aclanthology.org/fix-opsupmap-display/2025.tacl-1.51/
DOI:
10.1162/tacl.a.33
Cite (ACL):
Evgeniia Razumovskaia, Ivan Vulić, and Anna Korhonen. 2025. Analyzing and Adapting Large Language Models for Few-Shot Multilingual NLU: Are We There Yet?. Transactions of the Association for Computational Linguistics, 13:1096–1120.
Cite (Informal):
Analyzing and Adapting Large Language Models for Few-Shot Multilingual NLU: Are We There Yet? (Razumovskaia et al., TACL 2025)
PDF:
https://preview.aclanthology.org/fix-opsupmap-display/2025.tacl-1.51.pdf