EfficientXLang: Towards Improving Token Efficiency Through Cross-Lingual Reasoning

Sanchit Ahuja, Praneetha Vaddamanu, Barun Patra


Abstract
Despite recent advances in Reasoning Language Models (RLMs), most research focuses solely on English, even though many models are pretrained on multilingual data. In this work, we investigate: Is English the most token-efficient language for reasoning? We evaluate three open-source RLMs — DeepSeek R1, Qwen 2.5, and Qwen 3 — across four math datasets and seven typologically diverse languages. We find that reasoning in non-English languages not only reduces token usage but also preserves accuracy. These gains persist even after translating the reasoning traces into English, suggesting genuine shifts in reasoning behavior rather than surface-level linguistic effects. The extent of improvement, however, depends on the model's multilingual strength. Our findings motivate a broader view of reasoning in language models, highlighting the potential of multilingual reasoning and the importance of strong multilingual foundations. The code for our work can be found at: https://github.com/microsoft/EfficientXLang.
Anthology ID:
2025.findings-emnlp.845
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2025
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
15612–15624
URL:
https://preview.aclanthology.org/author-page-yu-wang-polytechnic/2025.findings-emnlp.845/
DOI:
10.18653/v1/2025.findings-emnlp.845
Cite (ACL):
Sanchit Ahuja, Praneetha Vaddamanu, and Barun Patra. 2025. EfficientXLang: Towards Improving Token Efficiency Through Cross-Lingual Reasoning. In Findings of the Association for Computational Linguistics: EMNLP 2025, pages 15612–15624, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
EfficientXLang: Towards Improving Token Efficiency Through Cross-Lingual Reasoning (Ahuja et al., Findings 2025)
PDF:
https://preview.aclanthology.org/author-page-yu-wang-polytechnic/2025.findings-emnlp.845.pdf
Checklist:
2025.findings-emnlp.845.checklist.pdf