Blessing of Multilinguality: A Systematic Analysis of Multilingual In-Context Learning

Yilei Tu, Andrew Xue, Freda Shi


Abstract
While multilingual large language models generally perform adequately on high-resource languages (HRLs), sometimes even rivaling their English performance, they often significantly underperform on low-resource languages (LRLs). Among several prompting strategies aimed at bridging this gap, multilingual in-context learning (ICL) has been particularly effective when demonstrations in the target language are unavailable. However, a systematic understanding of when and why it works well is still lacking. In this work, we systematically analyze multilingual ICL, using demonstrations in HRLs to enhance cross-lingual transfer. We show that demonstrations in mixed HRLs consistently outperform English-only ones across the board, particularly for tasks written in LRLs. Surprisingly, our ablation study shows that even the presence of irrelevant non-English sentences in the prompt yields measurable gains, suggesting that multilingual exposure itself is effective. Our results highlight the potential of strategically leveraging multilingual resources to bridge the performance gap for underrepresented languages.
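As a concrete illustration of the prompting strategy the abstract describes, the sketch below assembles a multilingual ICL prompt by sampling labeled demonstrations from several HRLs and prepending them to a query in a lower-resource target language. This is a minimal sketch, not the authors' exact setup: the task (sentiment classification), the demonstration pool, the prompt template, and the helper names are all illustrative assumptions.

```python
# Minimal sketch of multilingual in-context learning (ICL) prompt
# construction: demonstrations are drawn from mixed high-resource
# languages (HRLs) rather than English alone, then prepended to a
# query in a lower-resource target language. All data and names here
# are illustrative; the paper's actual tasks and templates may differ.
import random

# Hypothetical labeled demonstrations in mixed HRLs (sentiment task).
HRL_DEMONSTRATIONS = {
    "English": [("The movie was wonderful.", "positive")],
    "Spanish": [("La comida era terrible.", "negative")],
    "Chinese": [("这本书非常有趣。", "positive")],
    "French":  [("Le service était décevant.", "negative")],
}

def build_multilingual_icl_prompt(query: str, k: int = 4, seed: int = 0) -> str:
    """Sample up to k demonstrations across HRLs and prepend them to the query."""
    rng = random.Random(seed)
    pool = [(lang, ex) for lang, exs in HRL_DEMONSTRATIONS.items() for ex in exs]
    demos = rng.sample(pool, k=min(k, len(pool)))
    lines = [f"Sentence: {text}\nLabel: {label}" for _, (text, label) in demos]
    # The query is in the target language; the model must transfer.
    lines.append(f"Sentence: {query}\nLabel:")
    return "\n\n".join(lines)

# Example query in Swahili ("This movie is very good.").
print(build_multilingual_icl_prompt("Filamu hii ni nzuri sana."))
```

Under this framing, the paper's English-only baseline corresponds to sampling the pool from a single language, and its ablation corresponds to replacing the demonstrations' content with irrelevant non-English sentences while keeping the multilingual exposure.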
Anthology ID:
2025.findings-acl.323
Volume:
Findings of the Association for Computational Linguistics: ACL 2025
Month:
July
Year:
2025
Address:
Vienna, Austria
Editors:
Wanxiang Che, Joyce Nabende, Ekaterina Shutova, Mohammad Taher Pilehvar
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
6213–6248
URL:
https://preview.aclanthology.org/mtsummit-25-ingestion/2025.findings-acl.323/
DOI:
10.18653/v1/2025.findings-acl.323
Cite (ACL):
Yilei Tu, Andrew Xue, and Freda Shi. 2025. Blessing of Multilinguality: A Systematic Analysis of Multilingual In-Context Learning. In Findings of the Association for Computational Linguistics: ACL 2025, pages 6213–6248, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
Blessing of Multilinguality: A Systematic Analysis of Multilingual In-Context Learning (Tu et al., Findings 2025)
PDF:
https://preview.aclanthology.org/mtsummit-25-ingestion/2025.findings-acl.323.pdf