CoMoE: Contrastive Representation for Mixture-of-Experts in Parameter-Efficient Fine-tuning

Jinyuan Feng, ChaoPeng Wei, Tenghai Qiu, Tianyi Hu, Zhiqiang Pu


Abstract
In parameter-efficient fine-tuning, mixture-of-experts (MoE), which specializes functionalities into different experts and sparsely activates them as appropriate, has been widely adopted as a promising approach to trading off model capacity against computation overhead. However, current MoE variants fall short on heterogeneous datasets: they ignore the fact that experts may learn similar knowledge, which leads to the underutilization of MoE's capacity. In this paper, we propose Contrastive Representation for MoE (CoMoE), a novel method to promote modularization and specialization in MoE, in which the experts are trained with a contrastive objective that samples from the activated and inactivated experts in top-k routing. We demonstrate that this contrastive objective recovers the mutual-information gap between inputs and the two types of experts. Experiments on several benchmarks and in multi-task settings demonstrate that CoMoE consistently enhances MoE's capacity and promotes modularization among the experts.
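The abstract does not spell out the loss itself; as a minimal sketch of the idea, one could assume an InfoNCE-style formulation in which, for each token, the outputs of the activated (top-k) experts serve as positives and those of the inactivated experts as negatives. The function name, temperature, and exact form below are assumptions for illustration, not the paper's implementation.

```python
import torch
import torch.nn.functional as F

def comoe_style_contrastive_loss(token_repr, expert_outputs, topk_idx, temperature=0.1):
    """Hypothetical InfoNCE-style contrastive term (a sketch, not the paper's code).

    token_repr:     (d,) representation of the input token
    expert_outputs: (n_experts, d) outputs of all experts for this token
    topk_idx:       indices of the activated (top-k) experts
    """
    # Cosine similarity between the input and every expert's output.
    sims = F.cosine_similarity(token_repr.unsqueeze(0), expert_outputs, dim=-1) / temperature

    # Activated experts are positives; inactivated experts are negatives.
    mask = torch.zeros_like(sims, dtype=torch.bool)
    mask[topk_idx] = True

    # InfoNCE: pull the input toward activated experts, push it away from the rest.
    log_prob = sims - torch.logsumexp(sims, dim=0)
    return -log_prob[mask].mean()

# Toy usage: 8 experts, hidden size 16, top-2 routing.
torch.manual_seed(0)
x = torch.randn(16)
experts = torch.randn(8, 16)
loss = comoe_style_contrastive_loss(x, experts, topk_idx=torch.tensor([1, 5]))
print(loss)
```

Under the standard InfoNCE analysis, maximizing such a term lower-bounds the mutual information between inputs and the activated experts, which is consistent with (though not identical to) the abstract's claim about recovering a mutual-information gap.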
Anthology ID:
2025.findings-emnlp.398
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2025
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
7533–7551
URL:
https://preview.aclanthology.org/author-page-yu-wang-polytechnic/2025.findings-emnlp.398/
DOI:
10.18653/v1/2025.findings-emnlp.398
Cite (ACL):
Jinyuan Feng, ChaoPeng Wei, Tenghai Qiu, Tianyi Hu, and Zhiqiang Pu. 2025. CoMoE: Contrastive Representation for Mixture-of-Experts in Parameter-Efficient Fine-tuning. In Findings of the Association for Computational Linguistics: EMNLP 2025, pages 7533–7551, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
CoMoE: Contrastive Representation for Mixture-of-Experts in Parameter-Efficient Fine-tuning (Feng et al., Findings 2025)
PDF:
https://preview.aclanthology.org/author-page-yu-wang-polytechnic/2025.findings-emnlp.398.pdf
Checklist:
2025.findings-emnlp.398.checklist.pdf