Sketch-of-Thought: Efficient LLM Reasoning with Adaptive Cognitive-Inspired Sketching

Simon A. Aytes, Jinheon Baek, Sung Ju Hwang


Abstract
Recent advances in large language models (LLMs) have enabled strong reasoning capabilities through Chain-of-Thought (CoT) prompting, which elicits step-by-step problem solving, but often at the cost of excessive verbosity in intermediate outputs, leading to increased computational overhead. We propose Sketch-of-Thought (SoT), a prompting framework that integrates cognitively inspired reasoning paradigms with linguistic constraints to reduce token usage while preserving reasoning accuracy. SoT is designed as a flexible, modular approach and is instantiated with three paradigms—Conceptual Chaining, Chunked Symbolism, and Expert Lexicons—each tailored to distinct reasoning tasks and selected dynamically at test-time by a lightweight routing model. Across 18 reasoning datasets spanning multiple domains, languages, and modalities, SoT achieves token reductions of up to 84% with minimal accuracy loss. In tasks such as mathematical and multi-hop reasoning, it even improves accuracy while shortening outputs.
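The abstract describes test-time selection among three sketching paradigms by a lightweight routing model. The Python sketch below illustrates that dispatch pattern under stated assumptions: the keyword-based router and the paradigm prompt wordings are hypothetical stand-ins for the paper's trained router and released prompts, shown only to make the routing idea concrete.

```python
# Minimal sketch of test-time paradigm routing in the style of Sketch-of-Thought (SoT).
# ASSUMPTIONS: the routing rules and prompt texts below are illustrative only;
# SoT uses a trained lightweight routing model and its own paradigm prompts.

PARADIGM_PROMPTS = {
    "chunked_symbolism": (
        "Reason in compact symbolic steps: define variables, write equations, "
        "and avoid prose. End with '#### <answer>'."
    ),
    "conceptual_chaining": (
        "Reason as a short chain of linked concepts (A -> B -> C), "
        "omitting filler words. End with '#### <answer>'."
    ),
    "expert_lexicons": (
        "Reason using domain shorthand and expert abbreviations only. "
        "End with '#### <answer>'."
    ),
}


def route_paradigm(question: str) -> str:
    """Hypothetical keyword heuristic standing in for the lightweight routing model."""
    q = question.lower()
    if any(tok in q for tok in ("how many", "calculate", "sum", "%", "+", "=")):
        return "chunked_symbolism"
    if any(tok in q for tok in ("dose", "diagnosis", "voltage", "protocol")):
        return "expert_lexicons"
    return "conceptual_chaining"


def build_sot_messages(question: str) -> list[dict]:
    """Assemble chat messages: paradigm-specific system prompt + the user question."""
    paradigm = route_paradigm(question)
    return [
        {"role": "system", "content": PARADIGM_PROMPTS[paradigm]},
        {"role": "user", "content": question},
    ]


if __name__ == "__main__":
    msgs = build_sot_messages(
        "If a train travels 60 km in 45 minutes, how many km/h is that?"
    )
    for m in msgs:
        print(f"[{m['role']}] {m['content']}")
```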
Anthology ID:
2025.emnlp-main.1236
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
24307–24331
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1236/
Cite (ACL):
Simon A. Aytes, Jinheon Baek, and Sung Ju Hwang. 2025. Sketch-of-Thought: Efficient LLM Reasoning with Adaptive Cognitive-Inspired Sketching. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 24307–24331, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Sketch-of-Thought: Efficient LLM Reasoning with Adaptive Cognitive-Inspired Sketching (Aytes et al., EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1236.pdf
Checklist:
 2025.emnlp-main.1236.checklist.pdf