Dynamic Collaboration of Multi-Language Models based on Minimal Complete Semantic Units
Chao Hao | Zezheng Wang | Yanhua Huang | Ruiwen Xu | Wenzhe Niu | Xin Liu | Zitong Yu
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
This paper investigates enhancing the reasoning capabilities of language models through token-level multi-model collaboration. At each step of autoregressive reasoning, our approach selects the optimal token from the next-token distributions produced by multiple models. Contrary to the assumption that more models yield better results, we introduce a distribution distance-based dynamic selection strategy (DDS) to optimize the multi-model collaboration process. To address the critical challenge of vocabulary misalignment in multi-model collaboration, we propose the concept of minimal complete semantic units (MCSU), which is simple yet enables multiple language models to achieve natural alignment within the linguistic space. Experimental results across various benchmarks demonstrate the superiority of our method. The code will be released soon.
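As a rough illustration of the decoding loop the abstract describes, the sketch below mocks token-level collaboration between toy models over a small shared vocabulary. The abstract does not spell out the DDS criterion or the MCSU alignment mechanism, so Jensen-Shannon divergence with a simple closest-to-consensus rule stands in for DDS, and a shared toy vocabulary stands in for MCSU-based alignment; every name here is hypothetical, not the paper's implementation.

# Hedged sketch of token-level multi-model collaboration with a
# distance-based dynamic selection step. Jensen-Shannon divergence and
# the consensus rule below are assumptions standing in for the paper's
# DDS; a shared toy vocabulary stands in for MCSU alignment.
import numpy as np

VOCAB = ["<eos>", "the", "cat", "sat", "on", "mat"]

def js_divergence(p, q):
    """Jensen-Shannon divergence between two discrete distributions."""
    m = 0.5 * (p + q)
    def kl(a, b):
        mask = a > 0
        return np.sum(a[mask] * np.log(a[mask] / b[mask]))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def select_distribution(dists):
    """DDS stand-in: keep the distribution closest, on average, to the
    others, i.e. the step's 'consensus' model."""
    n = len(dists)
    avg_dist = [
        np.mean([js_divergence(dists[i], dists[j]) for j in range(n) if j != i])
        for i in range(n)
    ]
    return dists[int(np.argmin(avg_dist))]

def decode(models, max_len=10):
    """Greedy autoregressive decoding over the selected distributions."""
    tokens = []
    for _ in range(max_len):
        dists = [m(tokens) for m in models]   # next-token distribution per model
        chosen = select_distribution(dists)   # dynamic selection stand-in
        tok = VOCAB[int(np.argmax(chosen))]   # greedy pick of the optimal token
        if tok == "<eos>":
            break
        tokens.append(tok)
    return tokens

def make_toy_model(seed):
    """Toy stand-in 'model': softmax over random logits, biased toward
    <eos> as the prefix grows so decoding terminates."""
    rng = np.random.default_rng(seed)
    def model(prefix):
        logits = rng.normal(size=len(VOCAB))
        logits[0] += len(prefix)
        e = np.exp(logits - logits.max())
        return e / e.sum()
    return model

if __name__ == "__main__":
    print(decode([make_toy_model(s) for s in (0, 1, 2)]))

Swapping select_distribution for the actual DDS criterion and the toy vocabulary for MCSU-aligned units would be where the paper's method departs from this sketch.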