Answer Convergence as a Signal for Early Stopping in Reasoning

Xin Liu, Lu Wang


Abstract
Chain-of-thought (CoT) prompting enhances reasoning in large language models (LLMs) but often leads to verbose and redundant outputs, increasing inference cost. We hypothesize that many reasoning steps are unnecessary for producing correct answers. To investigate this, we begin with a systematic study of the minimum reasoning a model needs to reach a stable decision. Based on the resulting insights, we propose three inference-time strategies to improve efficiency: (1) early stopping via answer consistency, (2) boosting the probability of generating end-of-reasoning signals, and (3) a supervised method that learns when to stop based on internal activations. Experiments across five benchmarks and five open-weight LLMs show that our methods substantially reduce token usage with little or no accuracy drop. In particular, on NaturalQuestions, Answer Consistency reduces tokens by over 40% while further improving accuracy. Our work underscores the importance of cost-effective reasoning methods that operate at inference time, offering practical benefits for real-world applications.
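The first strategy, early stopping via answer consistency, can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes a hypothetical `probe_answer` callback that decodes the model's current answer after each reasoning chunk, and stops once the probed answer repeats over a sliding window.

```python
def early_stop_by_consistency(probe_answer, max_steps, window=2):
    """Stop generating reasoning once the probed answer stabilizes.

    probe_answer(step) is a hypothetical hook that returns the answer the
    model would give if forced to answer after `step` reasoning chunks.
    Generation halts when the last `window` probed answers agree.
    """
    answers = []
    for step in range(1, max_steps + 1):
        answers.append(probe_answer(step))
        # Check convergence: the last `window` answers are identical.
        if len(answers) >= window and len(set(answers[-window:])) == 1:
            return answers[-1], step
    # Budget exhausted without convergence; return the latest answer.
    return answers[-1], max_steps


# Toy probe: the answer fluctuates, then converges to "42" from step 3 on.
toy_answers = ["17", "23", "42", "42", "42", "42"]
answer, steps = early_stop_by_consistency(
    lambda s: toy_answers[s - 1], max_steps=6, window=2
)
# answer == "42", stopping at step 4 rather than spending all 6 steps
```

In a real decoding loop, `probe_answer` would correspond to intermittently forcing an answer from the partial chain of thought; the convergence check itself is the same fixed-window comparison shown here.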
Anthology ID:
2025.emnlp-main.904
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
17907–17918
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.904/
Cite (ACL):
Xin Liu and Lu Wang. 2025. Answer Convergence as a Signal for Early Stopping in Reasoning. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 17907–17918, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Answer Convergence as a Signal for Early Stopping in Reasoning (Liu & Wang, EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.904.pdf
Checklist:
 2025.emnlp-main.904.checklist.pdf