What Language(s) Does Aya-23 Think In? How Multilinguality Affects Internal Language Representations

Katharina A. T. T. Trinley, Toshiki Nakai, Tatiana Anikina, Tanja Baeumel


Abstract
Large language models (LLMs) excel at multilingual tasks, yet their internal language processing remains poorly understood. We analyze how Aya-23-8B, a decoder-only LLM trained on balanced multilingual data, handles code-mixed, cloze, and translation tasks compared to predominantly monolingual models like Llama 3 and Chinese-LLaMA-2. Using logit lens and neuron specialization analyses, we find: (1) Aya-23 activates typologically related language representations during translation, unlike English-centric models that rely on a single pivot language; (2) code-mixed neuron activation patterns vary with mixing rates and are shaped more by the base language than the mixed-in one; and (3) Aya-23’s language-specific neurons for code-mixed inputs concentrate in final layers, diverging from prior findings on decoder-only models. Neuron overlap analysis further shows that script similarity and typological relations impact processing across model types. These findings reveal how multilingual training shapes LLM internals and inform future cross-lingual transfer research. The code and dataset are publicly available.
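
To make the abstract's method concrete, below is a minimal logit-lens sketch (not the authors' released code): it projects each layer's hidden state through the model's final norm and unembedding matrix and prints the top next-token prediction per layer, which is how one can observe intermediate layers "thinking" in a pivot or typologically related language. The model id, prompt, and module path (model.model.norm) are assumptions about a standard Hugging Face Aya/Llama-style setup, not the paper's exact pipeline.

    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    # Illustrative model choice: Aya-23-8B, the model studied in the paper;
    # any Hugging Face causal LM with a Llama/Aya-style layout works similarly.
    model_name = "CohereForAI/aya-23-8B"
    tok = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForCausalLM.from_pretrained(
        model_name, torch_dtype=torch.float16, device_map="auto"
    )
    model.eval()

    # A translation-style prompt: we probe which language surfaces in
    # intermediate layers while the model produces the target word.
    prompt = 'Français: "fleur" - 中文: "'
    inputs = tok(prompt, return_tensors="pt").to(model.device)

    with torch.no_grad():
        out = model(**inputs, output_hidden_states=True)

    # Logit lens: decode each layer's residual stream at the last position
    # with the final norm + unembedding matrix, then read off the top token.
    unembed = model.get_output_embeddings()   # lm_head
    final_norm = model.model.norm             # final norm (assumed module path)

    for layer_idx, h in enumerate(out.hidden_states):
        logits = unembed(final_norm(h[:, -1]))
        top_id = logits.argmax(-1).item()
        print(f"layer {layer_idx:2d}: {tok.decode(top_id)!r}")

On English-centric models, such a per-layer readout typically surfaces English tokens in the middle layers even when the target language is not English; the abstract's finding is that Aya-23 instead surfaces typologically related languages during translation.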
Anthology ID: 2025.globalnlp-1.18
Volume: Proceedings of the Workshop on Beyond English: Natural Language Processing for all Languages in an Era of Large Language Models
Month: September
Year: 2025
Address: Varna, Bulgaria
Editors: Sudhansu Bala Das, Pruthwik Mishra, Alok Singh, Shamsuddeen Hassan Muhammad, Asif Ekbal, Uday Kumar Das
Venues: GlobalNLP | WS
Publisher: INCOMA Ltd., Shoumen, BULGARIA
Pages: 159–171
URL: https://preview.aclanthology.org/corrections-2026-01/2025.globalnlp-1.18/
Cite (ACL): Katharina A. T. T. Trinley, Toshiki Nakai, Tatiana Anikina, and Tanja Baeumel. 2025. What Language(s) Does Aya-23 Think In? How Multilinguality Affects Internal Language Representations. In Proceedings of the Workshop on Beyond English: Natural Language Processing for all Languages in an Era of Large Language Models, pages 159–171, Varna, Bulgaria. INCOMA Ltd., Shoumen, BULGARIA.
Cite (Informal): What Language(s) Does Aya-23 Think In? How Multilinguality Affects Internal Language Representations (Trinley et al., GlobalNLP 2025)
PDF: https://preview.aclanthology.org/corrections-2026-01/2025.globalnlp-1.18.pdf