TIPA: Typologically Informed Parameter Aggregation

Stef Accou, Wessel Poelman


Abstract
Massively multilingual language models enable cross-lingual generalization but underperform on low-resource and unseen languages. While adapter-based fine-tuning offers a parameter-efficient solution, training language-specific adapters at scale remains costly. We introduce Typologically Informed Parameter Aggregation (TIPA), a training-free framework that constructs proxy language adapters by aggregating existing ones, weighted by typological similarity. Integrated into the MAD-X architecture, these proxies enable zero-shot cross-lingual transfer without additional training. We evaluate TIPA on five NLP tasks and over 230 languages. TIPA consistently outperforms baselines such as English-only fine-tuning and selecting the typologically closest-language adapter, with the largest gains for languages lacking dedicated adapters. Our results demonstrate that typologically informed aggregation provides a viable, training-free alternative to language-specific modules.
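The core idea in the abstract, building a proxy adapter for an unseen language as a similarity-weighted combination of existing language adapters, can be sketched as follows. This is a minimal illustration only: the softmax weighting, the data layout (parameters as plain float lists), and the function name are assumptions, not the paper's exact formulation.

```python
import math

def aggregate_adapters(adapters, similarities):
    """Build a proxy adapter for an unseen target language by averaging
    existing language adapters, weighted by typological similarity.

    adapters: dict mapping language -> dict of param_name -> list of floats
    similarities: dict mapping language -> similarity score to the target
    (Hypothetical layout; softmax weighting is one plausible choice.)
    """
    # Softmax-normalize similarity scores into aggregation weights.
    exps = {lang: math.exp(s) for lang, s in similarities.items()}
    total = sum(exps.values())
    weights = {lang: e / total for lang, e in exps.items()}

    # Weighted sum of corresponding parameters across all adapters.
    any_adapter = next(iter(adapters.values()))
    proxy = {}
    for name, params in any_adapter.items():
        proxy[name] = [
            sum(weights[lang] * adapters[lang][name][i] for lang in adapters)
            for i in range(len(params))
        ]
    return proxy
```

Because the proxy is a pure function of already-trained adapter weights and precomputed similarity scores, it can be assembled at inference time with no gradient updates, which is what makes the approach training-free.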
Anthology ID:
2026.findings-eacl.119
Volume:
Findings of the Association for Computational Linguistics: EACL 2026
Month:
March
Year:
2026
Address:
Rabat, Morocco
Editors:
Vera Demberg, Kentaro Inui, Lluís Màrquez
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
2253–2267
URL:
https://preview.aclanthology.org/ingest-eacl/2026.findings-eacl.119/
Cite (ACL):
Stef Accou and Wessel Poelman. 2026. TIPA: Typologically Informed Parameter Aggregation. In Findings of the Association for Computational Linguistics: EACL 2026, pages 2253–2267, Rabat, Morocco. Association for Computational Linguistics.
Cite (Informal):
TIPA: Typologically Informed Parameter Aggregation (Accou & Poelman, Findings 2026)
PDF:
https://preview.aclanthology.org/ingest-eacl/2026.findings-eacl.119.pdf
Checklist:
 2026.findings-eacl.119.checklist.pdf