ThinkSwitcher: When to Think Hard, When to Think Fast

Guosheng Liang, Longguang Zhong, Ziyi Yang, Xiaojun Quan


Abstract
Large reasoning models (LRMs) excel at solving complex tasks by leveraging long chain-of-thought (CoT) reasoning. However, this often leads to overthinking on simple tasks, resulting in unnecessary computational overhead. We observe that LRMs inherently possess the capability for efficient short CoT reasoning, which can be reliably elicited through prompt design. To leverage this capability, we propose ThinkSwitcher, a framework that enables a single LRM to dynamically switch between short and long CoT modes based on task complexity. ThinkSwitcher introduces a lightweight switching module trained with supervision signals derived from the relative performance of each reasoning mode across tasks. Experiments on multiple reasoning benchmarks show that ThinkSwitcher reduces computational cost by 20-30% while maintaining high accuracy on complex tasks. This demonstrates the effectiveness of ThinkSwitcher as a scalable and efficient solution for unified LRM deployment.
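The abstract describes training the switching module on supervision signals derived from the relative performance of the two reasoning modes. A minimal sketch of how such labels could be constructed is below; the function name, the `margin` parameter, and the use of per-query pass rates are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

# Hypothetical sketch of ThinkSwitcher-style label construction: for each
# training query, compare the empirical pass rates of the short-CoT and
# long-CoT modes and derive a binary target for the switching module.
# All names here are illustrative, not from the paper.

def build_switch_labels(pass_rate_short, pass_rate_long, margin=0.0):
    """Return 1 (route to long CoT) where long CoT outperforms short CoT
    by more than `margin`, else 0 (short CoT suffices)."""
    short = np.asarray(pass_rate_short, dtype=float)
    long_ = np.asarray(pass_rate_long, dtype=float)
    return (long_ - short > margin).astype(int)

# Toy usage: queries solved equally well in both modes get label 0 (think
# fast); queries where long CoT clearly helps get label 1 (think hard).
labels = build_switch_labels([0.9, 0.2, 0.8], [0.9, 0.7, 0.85], margin=0.1)
print(labels.tolist())  # [0, 1, 0]
```

A lightweight classifier trained on such labels can then route each incoming query to the cheaper short-CoT mode whenever the expected accuracy loss is small, which is how the 20-30% cost reduction reported in the abstract would be realized.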
Anthology ID:
2025.findings-emnlp.278
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2025
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
5185–5201
URL:
https://preview.aclanthology.org/author-page-yu-wang-polytechnic/2025.findings-emnlp.278/
DOI:
10.18653/v1/2025.findings-emnlp.278
Bibkey:
Cite (ACL):
Guosheng Liang, Longguang Zhong, Ziyi Yang, and Xiaojun Quan. 2025. ThinkSwitcher: When to Think Hard, When to Think Fast. In Findings of the Association for Computational Linguistics: EMNLP 2025, pages 5185–5201, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
ThinkSwitcher: When to Think Hard, When to Think Fast (Liang et al., Findings 2025)
PDF:
https://preview.aclanthology.org/author-page-yu-wang-polytechnic/2025.findings-emnlp.278.pdf
Checklist:
2025.findings-emnlp.278.checklist.pdf