Yilei Tu


2025

Blessing of Multilinguality: A Systematic Analysis of Multilingual In-Context Learning
Yilei Tu | Andrew Xue | Freda Shi
Findings of the Association for Computational Linguistics: ACL 2025

While multilingual large language models generally perform adequately, and sometimes even rival English performance on high-resource languages (HRLs), they often significantly underperform on low-resource languages (LRLs). Among several prompting strategies aimed at bridging the gap, multilingual in-context learning (ICL) has been particularly effective when demonstrations in the target language are unavailable. However, a systematic understanding of when and why it works well is still lacking. In this work, we systematically analyze multilingual ICL, using demonstrations in HRLs to enhance cross-lingual transfer. We show that demonstrations in mixed HRLs consistently outperform English-only ones across the board, particularly for tasks written in LRLs. Surprisingly, our ablation study shows that even the presence of irrelevant non-English sentences in the prompt yields measurable gains, suggesting the effectiveness of multilingual exposure itself. Our results highlight the potential of strategically leveraging multilingual resources to bridge the performance gap for underrepresented languages.

2024

AI Support Systems for Academic Research
Susie Rao | Noah Mamié | Yilei Tu | Prakhar Bhandar
Proceedings of the 9th edition of the Swiss Text Analytics Conference

2023

SAINE: Scientific Annotation and Inference Engine of Scientific Research
Susie Xi Rao | Yilei Tu | Peter H. Egger
Proceedings of the 13th International Joint Conference on Natural Language Processing and the 3rd Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics: System Demonstrations