Exploring the Role of Transliteration in In-Context Learning for Low-resource Languages Written in Non-Latin Scripts

Chunlan Ma, Yihong Liu, Haotian Ye, Hinrich Schuetze


Abstract
Decoder-only large language models (LLMs) excel in high-resource languages across various tasks through few-shot or even zero-shot in-context learning (ICL). However, their performance often does not transfer well to low-resource languages, especially those written in non-Latin scripts. Inspired by recent work that leverages transliteration in encoder-only models, we investigate whether transliteration is also effective in improving LLMs’ performance for low-resource languages written in non-Latin scripts. To this end, we propose three prompt templates, where the target-language text is represented in (1) its original script, (2) Latin-script transliteration, or (3) both combined. We apply these methods to several representative LLMs of different sizes on various tasks including text classification and sequential labeling. Our findings show that the effectiveness of transliteration varies by task type and model size. For instance, all models benefit from transliteration for sequential labeling (with increases of up to 25%).
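The following is a minimal sketch (not the authors' code) of how the three prompt variants described in the abstract could be assembled; `romanize` is a hypothetical stand-in for a transliteration tool such as uroman, and the field names and instruction string are illustrative assumptions.

```python
def romanize(text: str) -> str:
    """Hypothetical transliterator; in practice a tool such as uroman would be used."""
    raise NotImplementedError

def build_prompt(text: str, variant: str, task_instruction: str) -> str:
    """Assemble one ICL query in one of the three template variants:
    original script, Latin-script transliteration, or both combined."""
    if variant == "original":
        body = f"Text: {text}"
    elif variant == "transliterated":
        body = f"Text (romanized): {romanize(text)}"
    elif variant == "combined":
        body = f"Text: {text}\nText (romanized): {romanize(text)}"
    else:
        raise ValueError(f"unknown variant: {variant}")
    return f"{task_instruction}\n{body}\nLabel:"

# Example usage for a text-classification query (few-shot demonstrations
# would be prepended in the same format):
# prompt = build_prompt(sentence, "combined", "Classify the topic of the text.")
```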
Anthology ID:
2025.mrl-main.27
Volume:
Proceedings of the 5th Workshop on Multilingual Representation Learning (MRL 2025)
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
David Ifeoluwa Adelani, Catherine Arnett, Duygu Ataman, Tyler A. Chang, Hila Gonen, Rahul Raja, Fabian Schmidt, David Stap, Jiayi Wang
Venues:
MRL | WS
Publisher:
Association for Computational Linguistics
Pages:
397–410
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.mrl-main.27/
Cite (ACL):
Chunlan Ma, Yihong Liu, Haotian Ye, and Hinrich Schuetze. 2025. Exploring the Role of Transliteration in In-Context Learning for Low-resource Languages Written in Non-Latin Scripts. In Proceedings of the 5th Workshop on Multilingual Representation Learning (MRL 2025), pages 397–410, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Exploring the Role of Transliteration in In-Context Learning for Low-resource Languages Written in Non-Latin Scripts (Ma et al., MRL 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.mrl-main.27.pdf