Deploying Multi-task Online Server with Large Language Model
Yincen Qu, Hengyue Liu, Kun Wang, Xiangying Dai, Xiaoou Lu, Hui Zhou, Chao Ma
Abstract
In the industry, numerous tasks are deployed online. Traditional approaches often tackle each task separately by its own network, which leads to excessive costs for developing and scaling models, especially in the context of large language models. Although multi-task methods can save costs through parameter sharing, they often struggle to outperform single-task methods in real-world applications. To tackle these challenges, we present a three-stage multi-task learning framework for large language models. It involves task filtering, followed by fine-tuning on high-resource tasks, and finally fine-tuning on all tasks. We conducted comprehensive experiments in single-task and multi-task settings. Our approach, exemplified on different benchmarks, demonstrates that it is able to achieve performance comparable to the single-task method while reducing up to 90.9% of its overhead.- Anthology ID:
- Anthology ID: 2025.coling-industry.41
- Volume: Proceedings of the 31st International Conference on Computational Linguistics: Industry Track
- Month: January
- Year: 2025
- Address: Abu Dhabi, UAE
- Editors: Owen Rambow, Leo Wanner, Marianna Apidianaki, Hend Al-Khalifa, Barbara Di Eugenio, Steven Schockaert, Kareem Darwish, Apoorv Agarwal
- Venue: COLING
- Publisher: Association for Computational Linguistics
- Pages: 483–495
- URL: https://preview.aclanthology.org/jlcl-multiple-ingestion/2025.coling-industry.41/
- Cite (ACL): Yincen Qu, Hengyue Liu, Kun Wang, Xiangying Dai, Xiaoou Lu, Hui Zhou, and Chao Ma. 2025. Deploying Multi-task Online Server with Large Language Model. In Proceedings of the 31st International Conference on Computational Linguistics: Industry Track, pages 483–495, Abu Dhabi, UAE. Association for Computational Linguistics.
- Cite (Informal): Deploying Multi-task Online Server with Large Language Model (Qu et al., COLING 2025)
- PDF: https://preview.aclanthology.org/jlcl-multiple-ingestion/2025.coling-industry.41.pdf
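The abstract describes a three-stage pipeline: task filtering, fine-tuning on high-resource tasks, and then fine-tuning on all tasks. The sketch below is a minimal, hypothetical outline of how such a pipeline could be organized; the `Task` fields, the score-gap filtering criterion, the `high_resource_threshold` value, and the placeholder `fine_tune` step are all illustrative assumptions and are not taken from the paper.

```python
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class Task:
    """Illustrative task record; field names are assumptions, not from the paper."""
    name: str
    examples: List[str]
    single_task_score: float   # validation score of a dedicated single-task model
    multi_task_score: float    # validation score of the same task under joint training


def filter_tasks(tasks: List[Task], margin: float = 0.02) -> List[Task]:
    """Stage 1 (assumed criterion): keep tasks whose joint-training score stays
    within `margin` of the single-task baseline, i.e. drop tasks that degrade
    badly when trained together with the others."""
    return [t for t in tasks if t.single_task_score - t.multi_task_score <= margin]


def fine_tune(model: Dict, examples: List[str]) -> Dict:
    """Placeholder for an LLM fine-tuning run; a real system would update weights here."""
    updated = dict(model)
    updated["seen_examples"] = updated.get("seen_examples", 0) + len(examples)
    return updated


def three_stage_pipeline(model: Dict, tasks: List[Task],
                         high_resource_threshold: int = 10_000) -> Dict:
    # Stage 1: task filtering.
    kept = filter_tasks(tasks)
    # Stage 2: fine-tune on high-resource tasks only (threshold is an assumption).
    high_resource = [t for t in kept if len(t.examples) >= high_resource_threshold]
    model = fine_tune(model, [x for t in high_resource for x in t.examples])
    # Stage 3: fine-tune on all retained tasks.
    model = fine_tune(model, [x for t in kept for x in t.examples])
    return model
```

Under these assumptions, a single multi-task model produced by `three_stage_pipeline` would replace one dedicated model per task, which is where the overhead reduction reported in the abstract comes from.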