Procrustes Analysis for Improving Language Model Merging

Olivier Ferret


Abstract
The availability of many neural language models fine-tuned for different tasks naturally raises the question of whether it is worthwhile to combine them, particularly through parameter merging, the least resource-intensive option. Among the many existing methods, some focus on aligning parameters before the actual merging. In this article, we propose a new method in this line of research, based on Procrustes analysis. We evaluate it for merging models fine-tuned for the same task and derived from the same encoder-based model. Considering nine tasks from the GLUE benchmark, three Named Entity Recognition tasks, and six reference merging methods, we show that our proposal can improve upon existing merging methods in most tested configurations.
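The abstract does not detail the merging procedure, but the classical orthogonal Procrustes problem it builds on has a standard closed-form solution via SVD. The sketch below (NumPy, with hypothetical variable names) illustrates the generic technique of aligning one weight matrix to another with an orthogonal map before simple parameter averaging; it is an illustration of the underlying idea, not the author's exact method.

```python
import numpy as np

def procrustes_align(w_a, w_b):
    """Return the orthogonal matrix R minimizing ||w_a @ R - w_b||_F.

    Classical orthogonal Procrustes solution: SVD of w_a.T @ w_b,
    then R = U @ Vt.
    """
    u, _, vt = np.linalg.svd(w_a.T @ w_b)
    return u @ vt

# Illustrative use: align one model's weight matrix to another's
# before averaging the two (generic sketch, not the paper's pipeline).
rng = np.random.default_rng(0)
w_b = rng.standard_normal((8, 8))
q = np.linalg.qr(rng.standard_normal((8, 8)))[0]  # random orthogonal rotation
w_a = w_b @ q.T                                   # a rotated copy of w_b

r = procrustes_align(w_a, w_b)
aligned = w_a @ r                # undoes the rotation in this toy setting
merged = 0.5 * (aligned + w_b)   # simple parameter averaging after alignment
print(np.allclose(aligned, w_b))
```

In this toy case the alignment recovers the rotation exactly, so the printed value is `True`; with independently fine-tuned models the residual is nonzero and alignment only reduces, rather than eliminates, the mismatch before merging.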
Anthology ID:
2026.lrec-main.783
Volume:
Proceedings of the Fifteenth Language Resources and Evaluation Conference
Month:
May
Year:
2026
Address:
Palma de Mallorca, Spain
Editors:
Stelios Piperidis, Núria Bel, Henk van den Heuvel, Nancy Ide, Simon Krek, Antonio Toral
Venue:
LREC
Publisher:
ELRA Language Resources Association
Pages:
9988–9998
URL:
https://preview.aclanthology.org/ingest-lrec/2026.lrec-main.783/
Cite (ACL):
Olivier Ferret. 2026. Procrustes Analysis for Improving Language Model Merging. In Proceedings of the Fifteenth Language Resources and Evaluation Conference, pages 9988–9998, Palma de Mallorca, Spain. ELRA Language Resources Association.
Cite (Informal):
Procrustes Analysis for Improving Language Model Merging (Ferret, LREC 2026)
PDF:
https://preview.aclanthology.org/ingest-lrec/2026.lrec-main.783.pdf