@inproceedings{liu-niehues-2025-middle,
    title = "Middle-Layer Representation Alignment for Cross-Lingual Transfer in Fine-Tuned {LLM}s",
    author = "Liu, Danni  and
      Niehues, Jan",
    editor = "Che, Wanxiang  and
      Nabende, Joyce  and
      Shutova, Ekaterina  and
      Pilehvar, Mohammad Taher",
    booktitle = "Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)",
    month = jul,
    year = "2025",
    address = "Vienna, Austria",
    publisher = "Association for Computational Linguistics",
    url = "https://preview.aclanthology.org/ingest-emnlp/2025.acl-long.778/",
    doi = "10.18653/v1/2025.acl-long.778",
    pages = "15979--15996",
    ISBN = "979-8-89176-251-0",
    abstract = "While large language models demonstrate remarkable capabilities at task-specific applications through fine-tuning, extending these benefits across diverse languages is essential for broad accessibility. However, effective cross-lingual transfer is hindered by LLM performance gaps across languages and the scarcity of fine-tuning data in many languages. Through analysis of LLM internal representations from over 1,000+ language pairs, we discover that middle layers exhibit the strongest potential for cross-lingual alignment. Building on this finding, we propose a middle-layer alignment objective integrated into task-specific training. Our experiments on slot filling, machine translation, and structured text generation show consistent improvements in cross-lingual transfer, especially to lower-resource languages. The method is robust to the choice of alignment languages and generalizes to languages unseen during alignment. Furthermore, we show that separately trained alignment modules can be merged with existing task-specific modules, improving cross-lingual capabilities without full re-training. The code is provided in the supplementary materials."
}
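
The abstract describes a middle-layer alignment objective added on top of task-specific fine-tuning. As a rough illustration only (not the authors' released code), here is a minimal sketch assuming PyTorch, a Hugging Face causal LM, mean-pooled hidden states from one middle layer, and a cosine-distance alignment term on parallel sentence pairs; all identifiers (`layer_idx`, `lambda_align`, `mean_pool`, the placeholder model name) are hypothetical choices, and the paper's actual objective and layer selection may differ:

```python
# Hedged sketch: combine a task loss with a middle-layer alignment loss,
# loosely following the idea summarized in the abstract above.
import torch
import torch.nn.functional as F
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "meta-llama/Llama-2-7b-hf"  # placeholder base model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

layer_idx = model.config.num_hidden_layers // 2  # pick a middle layer
lambda_align = 0.1  # assumed weighting between task and alignment losses


def mean_pool(hidden, attention_mask):
    # Average token representations, ignoring padding positions.
    mask = attention_mask.unsqueeze(-1).to(hidden.dtype)
    return (hidden * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)


def training_step(task_batch, src_batch, tgt_batch):
    # Standard causal-LM task loss on the fine-tuning data.
    task_out = model(**task_batch, labels=task_batch["input_ids"])

    # Middle-layer representations of a parallel sentence pair.
    src_out = model(**src_batch, output_hidden_states=True)
    tgt_out = model(**tgt_batch, output_hidden_states=True)
    src_vec = mean_pool(src_out.hidden_states[layer_idx],
                        src_batch["attention_mask"])
    tgt_vec = mean_pool(tgt_out.hidden_states[layer_idx],
                        tgt_batch["attention_mask"])

    # Pull parallel sentences together at the middle layer
    # (cosine distance here; the paper may use a different criterion).
    align_loss = 1.0 - F.cosine_similarity(src_vec, tgt_vec, dim=-1).mean()

    return task_out.loss + lambda_align * align_loss
```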