CAC-CoT: Connector-Aware Compact Chain-of-Thought for Efficient Reasoning Data Synthesis Across Dual-System Cognitive Tasks
Sunguk Choi | Yonghoon Kwon | Heondeuk Lee
Findings of the Association for Computational Linguistics: EMNLP 2025
Long chain-of-thought (CoT) prompting helps Large Language Models (LLMs) solve difficult problems, but very long traces often slow or even degrade performance on fast, intuitive “System-1” tasks. We introduce Connector-Aware Compact CoT (CAC-CoT), a method that deliberately restricts reasoning to a small, fixed set of connector phrases, steering the model toward concise and well-structured explanations. Despite its simplicity, our synthesis method, using general-purpose LLMs, yields high-quality training data. CAC-CoT achieves ≈85% on GSM8K and ≈40% on GPQA (System-2) while also achieving ≈85% on S1-Bench (System-1), surpassing the baseline by over 20%. Its reasoning traces average ≈300 tokens (ART), about one-third the length of baseline traces, delivering higher efficiency without loss of accuracy.
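As a rough illustration of the connector-aware constraint described in the abstract, the sketch below filters synthetic reasoning traces so that every sentence-initial connector comes from a fixed allowed set and the whole trace fits a compact token budget. The connector list, the 300-token budget, and the helper names are illustrative assumptions, not the paper's actual implementation.

```python
# Minimal sketch (not the authors' released code): keep only synthetic
# reasoning traces whose sentence-initial connectors belong to a small,
# fixed set and whose length stays within a compact token budget.

ALLOWED_CONNECTORS = {"First,", "Then,", "Next,", "Therefore,", "So,"}  # hypothetical set
MAX_TOKENS = 300  # the abstract reports traces averaging ~300 tokens


def sentence_connectors(trace: str) -> list[str]:
    """Return the leading word of each sentence when it looks like a connector
    (i.e., it ends with a comma)."""
    connectors = []
    for sentence in trace.replace("\n", " ").split(". "):
        words = sentence.strip().split(" ", 1)
        if words and words[0].endswith(","):
            connectors.append(words[0])
    return connectors


def is_compact_connector_aware(trace: str) -> bool:
    """Accept a trace only if it fits the token budget (whitespace tokens here)
    and every sentence-initial connector is in the allowed set."""
    if len(trace.split()) > MAX_TOKENS:
        return False
    return all(c in ALLOWED_CONNECTORS for c in sentence_connectors(trace))


if __name__ == "__main__":
    trace = "First, compute 3 * 4 = 12. Then, add 5 to get 17. Therefore, the answer is 17."
    print(is_compact_connector_aware(trace))  # True under these assumptions
```

In a data-synthesis pipeline, a filter like this would sit between a general-purpose LLM generating candidate traces and the training set, discarding traces that are too long or that stray outside the approved connector vocabulary.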