NeuronMerge: Merging Models via Functional Neuron Groups

Wangyun Gu, Qianghua Gao, Zhang Li-Xin, Xu Shen, Jieping Ye


Abstract
Model merging techniques like task arithmetic, which combines model parameters through weighted averaging, have proven effective. However, the success of task arithmetic relies on the linearity between model weight differences and output feature changes, which is often lacking in conventional fine-tuned models. In this work, we employ neuron description methods to analyze and classify neurons based on their functionalities. We theoretically demonstrate that grouping Multi-Layer Perceptron (MLP) neurons by functionality enhances model linearity. Building on this, we propose a neuron-based task arithmetic merging method that consistently improves performance across various tasks and model scales. Our approach is complementary to existing merging techniques, achieving superior results in merging models fine-tuned on fundamental tasks like Math, Code and Translation.
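To make the merging operation concrete, here is a minimal sketch in PyTorch of task arithmetic with per-group scaling of MLP neurons, assuming models are handled as plain state dicts. The group labels and coefficients below are hypothetical placeholders; the paper derives neuron groups from neuron description methods, which this sketch does not reproduce.

```python
import torch

def task_vector(base_sd, tuned_sd):
    # Task vector: fine-tuned weights minus base weights.
    return {k: tuned_sd[k] - base_sd[k] for k in base_sd}

def merge_neuron_grouped(base_sd, tuned_sds, groups, group_coeffs, alpha=0.5):
    # groups[name]: LongTensor of group ids, one per output row (neuron) of
    #   the MLP weight `name` (hypothetical labels standing in for the
    #   functional classification described in the paper).
    # group_coeffs: FloatTensor with one merging coefficient per group id.
    # alpha: fallback coefficient for ungrouped parameters (plain task arithmetic).
    merged = {k: v.clone() for k, v in base_sd.items()}
    for tuned_sd in tuned_sds:
        delta = task_vector(base_sd, tuned_sd)
        for name, d in delta.items():
            if name in groups:
                # Scale each functional neuron group's rows with its own coefficient.
                scale = group_coeffs[groups[name]].unsqueeze(-1)  # [out_dim, 1]
                merged[name] += scale * d
            else:
                merged[name] += alpha * d
    return merged

# Toy usage: a 4-neuron MLP weight split into two functional groups.
base   = {"mlp.weight": torch.zeros(4, 3)}
tuned  = {"mlp.weight": torch.ones(4, 3)}
groups = {"mlp.weight": torch.tensor([0, 0, 1, 1])}
merged = merge_neuron_grouped(base, [tuned], groups, torch.tensor([0.2, 0.8]))
```

The key difference from vanilla task arithmetic is that the merging coefficient varies per row of the weight matrix, so neurons in the same functional group are merged together while different groups can be weighted differently.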
Anthology ID: 2025.findings-acl.471
Volume: Findings of the Association for Computational Linguistics: ACL 2025
Month: July
Year: 2025
Address: Vienna, Austria
Editors: Wanxiang Che, Joyce Nabende, Ekaterina Shutova, Mohammad Taher Pilehvar
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 9015–9037
URL: https://preview.aclanthology.org/display_plenaries/2025.findings-acl.471/
Cite (ACL): Wangyun Gu, Qianghua Gao, Zhang Li-Xin, Xu Shen, and Jieping Ye. 2025. NeuronMerge: Merging Models via Functional Neuron Groups. In Findings of the Association for Computational Linguistics: ACL 2025, pages 9015–9037, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal): NeuronMerge: Merging Models via Functional Neuron Groups (Gu et al., Findings 2025)
PDF: https://preview.aclanthology.org/display_plenaries/2025.findings-acl.471.pdf