Balanced Multi-Factor In-Context Learning for Multilingual Large Language Models

Masahiro Kaneko, Alham Fikri Aji, Timothy Baldwin


Abstract
Multilingual large language models (MLLMs) can use in-context learning (ICL) to achieve high performance by exploiting cross-lingual knowledge transfer without parameter updates. However, their effectiveness is highly sensitive to example selection, particularly in multilingual settings. Based on the findings of existing work, three key factors influence multilingual ICL: (1) semantic similarity, (2) linguistic alignment, and (3) language-specific performance. Yet existing approaches address these factors independently, without explicitly disentangling their combined impact, leaving optimal example selection underexplored. To address this gap, we propose balanced multi-factor ICL (BMF-ICL), a method that quantifies and optimally balances these factors for improved example selection. Experiments on mCSQA and TYDI across four MLLMs demonstrate that BMF-ICL outperforms existing methods. Further analysis highlights the importance of incorporating all three factors and of selecting examples from multiple languages.
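A minimal sketch of the example-selection idea the abstract describes, assuming the three factors are combined as a simple weighted score over candidate examples. The weights, lookup tables, and function name here are hypothetical illustrations, not the paper's exact formulation:

```python
import numpy as np

def select_icl_examples(query_emb, cand_embs, cand_langs,
                        lang_align, lang_perf,
                        w_sem=1.0, w_lang=1.0, w_perf=1.0, k=4):
    """Rank candidate ICL examples by a weighted sum of three factors.

    query_emb : (d,) embedding of the test query
    cand_embs : (n, d) embeddings of candidate examples
    cand_langs: list of n language codes, one per candidate
    lang_align: dict mapping language -> alignment score with the query language
    lang_perf : dict mapping language -> model's per-language performance
    w_*       : weights balancing the factors (hypothetical defaults)
    """
    # Factor 1: semantic similarity (cosine between query and each candidate).
    sims = cand_embs @ query_emb / (
        np.linalg.norm(cand_embs, axis=1) * np.linalg.norm(query_emb) + 1e-9)
    # Factors 2 and 3: linguistic alignment and language-specific
    # performance, looked up per candidate by its language.
    align = np.array([lang_align[l] for l in cand_langs])
    perf = np.array([lang_perf[l] for l in cand_langs])
    # Balance the three factors and return the indices of the top-k examples.
    score = w_sem * sims + w_lang * align + w_perf * perf
    return np.argsort(-score)[:k]
```

Because candidates are scored individually rather than filtered to the query's language, the top-k set can naturally mix examples from multiple languages, consistent with the abstract's finding that multilingual example pools help.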
Anthology ID:
2025.emnlp-main.1016
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
20096–20115
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1016/
Cite (ACL):
Masahiro Kaneko, Alham Fikri Aji, and Timothy Baldwin. 2025. Balanced Multi-Factor In-Context Learning for Multilingual Large Language Models. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 20096–20115, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Balanced Multi-Factor In-Context Learning for Multilingual Large Language Models (Kaneko et al., EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1016.pdf
Checklist:
 2025.emnlp-main.1016.checklist.pdf