@inproceedings{qiu-etal-2025-superpose,
    title = "Superpose Task-specific Features for Model Merging",
    author = "Qiu, Haiquan  and
      Wu, You  and
      Li, Dong  and
      Guo, Jianmin  and
      Yao, Quanming",
    editor = "Christodoulopoulos, Christos  and
      Chakraborty, Tanmoy  and
      Rose, Carolyn  and
      Peng, Violet",
    booktitle = "Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing",
    month = nov,
    year = "2025",
    address = "Suzhou, China",
    publisher = "Association for Computational Linguistics",
    url = "https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.210/",
    pages = "4200--4214",
    ISBN = "979-8-89176-332-6",
    abstract = "Model merging enables powerful capabilities in neural networks without requiring additional training. In this paper, we introduce a novel perspective on model merging by leveraging the fundamental mechanisms of neural network representation. Our approach is motivated by the linear representation hypothesis, which states that neural networks encode information through linear combinations of feature vectors. We propose a method that superposes task-specific features from individual models into a merged model. Our approach specifically targets linear transformation matrices, which are crucial for feature activation and extraction in deep networks. By formulating the merging process as a linear system, we can preserve output feature directions from individual models and create merged models that effectively maintain multi-task capabilities compared to existing methods. Extensive experiments across diverse benchmarks and models demonstrate that our method outperforms existing techniques."
}
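The abstract frames merging as solving a linear system over the models' linear transformation matrices so that each task's output feature directions are preserved. The sketch below is only an illustration of that general idea, not the paper's actual algorithm: the function name `merge_linear_layers`, the `ridge` regularizer, and the use of a per-task set of representative activations are all assumptions introduced here for demonstration.

```python
# Illustrative sketch only (not the paper's actual algorithm): merge the
# task-specific linear layers W_1, ..., W_T into one matrix W by solving a
# least-squares linear system so that W x ~= W_t x for inputs x from task t,
# i.e. each task's output feature directions are (approximately) preserved.
import numpy as np

def merge_linear_layers(task_weights, task_inputs, ridge=1e-6):
    """task_weights: list of (d_out, d_in) fine-tuned weight matrices.
    task_inputs:  list of (n_t, d_in) representative activations per task
                  (a hypothetical calibration set, assumed for this sketch).
    ridge:        small regularizer for numerical stability (an assumption).
    """
    d_in = task_weights[0].shape[1]
    # Normal equations of  min_W  sum_t ||W X_t^T - W_t X_t^T||_F^2 :
    #   W (sum_t X_t^T X_t) = sum_t W_t X_t^T X_t
    gram = ridge * np.eye(d_in)            # (d_in, d_in)
    rhs = np.zeros_like(task_weights[0])   # (d_out, d_in)
    for W_t, X_t in zip(task_weights, task_inputs):
        cov = X_t.T @ X_t                  # second moment of task-t inputs
        gram += cov
        rhs += W_t @ cov
    # gram is symmetric, so  W = rhs @ gram^{-1} = solve(gram, rhs^T)^T.
    return np.linalg.solve(gram, rhs.T).T

# Toy usage with random matrices (dimensions are arbitrary):
rng = np.random.default_rng(0)
weights = [rng.standard_normal((8, 16)) for _ in range(3)]
inputs = [rng.standard_normal((32, 16)) for _ in range(3)]
W_merged = merge_linear_layers(weights, inputs)
assert W_merged.shape == (8, 16)
```

Under these assumptions, merging each layer reduces to one symmetric linear solve of size d_in, which is consistent with the abstract's claim that the merge can be posed as a linear system rather than requiring additional training.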