Luth: Efficient French Specialization for Small Language Models and Cross-Lingual Transfer

Maxence Lasbordes, Sinoué Gad


Abstract
The landscape of Large Language Models remains predominantly English-centric, resulting in a significant performance gap for other major languages, such as French, especially in the context of Small Language Models (SLMs). Existing multilingual models demonstrate considerably lower performance in French compared to English, and research on efficient adaptation methods for French remains limited. To address this, we introduce Luth, a family of French-specialized SLMs: through targeted post-training on curated, high-quality French data, our models outperform all open-source counterparts of comparable size on multiple French benchmarks while retaining their original English capabilities. We further show that strategic model merging enhances performance in both languages, establishing Luth as a new state of the art for French SLMs and a robust baseline for future French-language research.
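
The abstract attributes part of the bilingual gains to model merging. As a rough illustration of what weight-space merging can look like in practice, below is a minimal sketch using simple linear interpolation of parameters in PyTorch; the paper's actual merging recipe, checkpoints, and mixing coefficient are not given on this page, and the model names here are placeholders, not the checkpoints used for Luth.

import torch
from transformers import AutoModelForCausalLM

def linear_merge(model_a, model_b, alpha=0.5):
    """Interpolate parameters as alpha * A + (1 - alpha) * B, written into model_a."""
    state_a = model_a.state_dict()
    state_b = model_b.state_dict()
    merged = {}
    for name, tensor_a in state_a.items():
        tensor_b = state_b[name]
        if torch.is_floating_point(tensor_a):
            merged[name] = alpha * tensor_a + (1.0 - alpha) * tensor_b
        else:
            # Non-float entries (e.g. integer buffers) are copied unchanged.
            merged[name] = tensor_a
    model_a.load_state_dict(merged)
    return model_a

# Placeholder checkpoint names, for illustration only.
base = AutoModelForCausalLM.from_pretrained("org/base-slm")
french = AutoModelForCausalLM.from_pretrained("org/french-slm")
merged = linear_merge(french, base, alpha=0.5)
merged.save_pretrained("merged-slm")

Linear merging of this kind only makes sense when both checkpoints share an architecture and descend from a common base; the coefficient alpha trades the specialized model's French gains against the base model's English capabilities.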
Anthology ID: 2026.eacl-srw.5
Volume: Proceedings of the 19th Conference of the European Chapter of the Association for Computational Linguistics (Volume 4: Student Research Workshop)
Month: March
Year: 2026
Address: Rabat, Morocco
Editors: Selene Baez Santamaria, Sai Ashish Somayajula, Atsuki Yamaguchi
Venue: EACL
Publisher: Association for Computational Linguistics
Pages: 48–59
URL: https://preview.aclanthology.org/ingest-eacl/2026.eacl-srw.5/
Cite (ACL): Maxence Lasbordes and Sinoué Gad. 2026. Luth: Efficient French Specialization for Small Language Models and Cross-Lingual Transfer. In Proceedings of the 19th Conference of the European Chapter of the Association for Computational Linguistics (Volume 4: Student Research Workshop), pages 48–59, Rabat, Morocco. Association for Computational Linguistics.
Cite (Informal): Luth: Efficient French Specialization for Small Language Models and Cross-Lingual Transfer (Lasbordes & Gad, EACL 2026)
PDF: https://preview.aclanthology.org/ingest-eacl/2026.eacl-srw.5.pdf