Superpose Task-specific Features for Model Merging

Haiquan Qiu, You Wu, Dong Li, Jianmin Guo, Quanming Yao


Abstract
Model merging enables powerful capabilities in neural networks without requiring additional training. In this paper, we introduce a novel perspective on model merging by leveraging the fundamental mechanisms of neural network representation. Our approach is motivated by the linear representation hypothesis, which states that neural networks encode information through linear combinations of feature vectors. We propose a method that superposes task-specific features from individual models into a merged model. Our approach specifically targets linear transformation matrices, which are crucial for feature activation and extraction in deep networks. By formulating the merging process as a linear system, we preserve the output feature directions of individual models and create merged models that maintain multi-task capabilities more effectively than existing methods. Extensive experiments across diverse benchmarks and models demonstrate that our method outperforms existing techniques.
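The abstract describes merging linear transformation matrices by solving a linear system that preserves each task model's output feature directions. As a rough illustration only, and not the authors' actual algorithm, the sketch below merges the weights of one linear layer by least squares so that the merged matrix approximately reproduces each task model's outputs on per-task calibration features; the function name merge_linear_layer, the ridge term, and the calibration inputs are all hypothetical assumptions made for this example.

```python
import numpy as np

def merge_linear_layer(task_weights, task_inputs, ridge=1e-6):
    """Merge per-task weight matrices of a single linear layer.

    task_weights: list of (d_out, d_in) arrays, one fine-tuned weight per task.
    task_inputs:  list of (d_in, n_t) arrays of representative input features
                  per task (hypothetical calibration data).
    Returns one (d_out, d_in) matrix W such that W @ X_t approximates
    W_t @ X_t for every task, i.e. task-specific output features are
    approximately preserved by solving a single regularized linear system.
    """
    X = np.concatenate(task_inputs, axis=1)                    # (d_in, N)
    Y = np.concatenate(
        [W @ Xt for W, Xt in zip(task_weights, task_inputs)],  # (d_out, N)
        axis=1,
    )
    d_in = X.shape[0]
    # Normal equations with a small ridge term for numerical stability:
    # W = Y X^T (X X^T + ridge * I)^{-1}
    gram = X @ X.T + ridge * np.eye(d_in)
    return Y @ X.T @ np.linalg.inv(gram)

# Toy usage: two "tasks" sharing a 4-dimensional input space.
rng = np.random.default_rng(0)
Ws = [rng.normal(size=(3, 4)) for _ in range(2)]
Xs = [rng.normal(size=(4, 10)) for _ in range(2)]
W_merged = merge_linear_layer(Ws, Xs)
print(W_merged.shape)  # (3, 4)
```

In this reading, the merged weight is the least-squares solution of a linear system built from the stacked per-task inputs and outputs, which is one plausible way to "superpose" task-specific output features into a single matrix.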
Anthology ID:
2025.emnlp-main.210
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
4200–4214
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.210/
Cite (ACL):
Haiquan Qiu, You Wu, Dong Li, Jianmin Guo, and Quanming Yao. 2025. Superpose Task-specific Features for Model Merging. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 4200–4214, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Superpose Task-specific Features for Model Merging (Qiu et al., EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.210.pdf
Checklist:
2025.emnlp-main.210.checklist.pdf