TagRouter: Learning Route to LLMs through Tags for Open-Domain Text Generation Tasks

Zhou Chen, Zhiqiang Wei, Yuqi Bai, Xue Xiong, Jianmin Wu


Abstract
Model routing allocates each query to a suitable model, improving system performance while reducing costs. However, existing routing methods face practical limitations that hinder scalability in large-scale applications and struggle to keep pace with the rapid growth of the large language model (LLM) ecosystem. To address these challenges, we propose TagRouter, a training-free model routing method designed to optimize the synergy among multiple LLMs for open-domain text generation tasks. Experimental results demonstrate that TagRouter outperforms 13 baseline methods, increasing the system's accept rate by 6.15% and reducing costs by 17.20%, achieving optimal cost-efficiency. Our findings provide the LLM community with an efficient and scalable solution for model ensembling, offering users an evolvable “super model.”
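To make the idea of tag-based routing concrete, the following is a minimal, hypothetical Python sketch: extract tags from a query, then dispatch to the model whose per-tag affinity, net of cost, is highest. It is not the authors' implementation; the tag vocabulary, affinity table, cost figures, and function names (extract_tags, route) are all invented assumptions, and TagRouter's actual mechanism is described in the paper.

# Hypothetical sketch of tag-based model routing, in the spirit of the
# abstract above; not the authors' implementation. The tag vocabulary,
# affinity scores, and cost figures below are invented for illustration.

# Assumed per-model affinity score for each tag. In a real router these
# would be estimated from feedback data, not written by hand.
MODEL_TAG_SCORES = {
    "small-cheap-llm":  {"summarization": 0.80, "chitchat": 0.85, "code": 0.40},
    "large-costly-llm": {"summarization": 0.90, "chitchat": 0.88, "code": 0.92},
}

# Assumed per-query cost of each model, in arbitrary units.
MODEL_COSTS = {"small-cheap-llm": 1.0, "large-costly-llm": 8.0}

def extract_tags(query: str) -> list[str]:
    """Toy keyword tagger; a real system would use a learned tagger."""
    q = query.lower()
    tags = []
    if any(k in q for k in ("summarize", "tl;dr")):
        tags.append("summarization")
    if any(k in q for k in ("function", "bug", "python", "code")):
        tags.append("code")
    return tags or ["chitchat"]

def route(query: str, cost_weight: float = 0.02) -> str:
    """Pick the model maximizing mean tag affinity minus weighted cost."""
    tags = extract_tags(query)
    best_model, best_utility = None, float("-inf")
    for model, scores in MODEL_TAG_SCORES.items():
        quality = sum(scores.get(t, 0.0) for t in tags) / len(tags)
        utility = quality - cost_weight * MODEL_COSTS[model]
        if utility > best_utility:
            best_model, best_utility = model, utility
    return best_model

# The cheap model wins on an easy query; the capable model wins on code.
print(route("Summarize this article, please"))        # small-cheap-llm
print(route("Fix the bug in this Python function"))   # large-costly-llm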
Anthology ID: 2025.findings-acl.1110
Volume: Findings of the Association for Computational Linguistics: ACL 2025
Month: July
Year: 2025
Address: Vienna, Austria
Editors: Wanxiang Che, Joyce Nabende, Ekaterina Shutova, Mohammad Taher Pilehvar
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 21539–21564
URL: https://preview.aclanthology.org/corrections-2025-08/2025.findings-acl.1110/
DOI: 10.18653/v1/2025.findings-acl.1110
Cite (ACL): Zhou Chen, Zhiqiang Wei, Yuqi Bai, Xue Xiong, and Jianmin Wu. 2025. TagRouter: Learning Route to LLMs through Tags for Open-Domain Text Generation Tasks. In Findings of the Association for Computational Linguistics: ACL 2025, pages 21539–21564, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal): TagRouter: Learning Route to LLMs through Tags for Open-Domain Text Generation Tasks (Chen et al., Findings 2025)
PDF: https://preview.aclanthology.org/corrections-2025-08/2025.findings-acl.1110.pdf