Forest for the Trees: Overarching Prompting Evokes High-Level Reasoning in Large Language Models

Haoran Liao, Shaohua Hu, Zhihao Zhu, Hao He, Yaohui Jin


Abstract
Chain-of-thought (CoT) and subsequent methods adopt a deductive paradigm that decomposes the reasoning process, demonstrating remarkable performance across NLP tasks. However, this paradigm risks getting bogged down in low-level semantic details, hindering large language models (LLMs) from correctly understanding, selecting, and composing conditions. In this work, we present Overarching Prompting (OaP), a simple prompting method that elicits high-level thinking in LLMs. Specifically, OaP first abstracts the whole problem into a simplified archetype and then formulates strategies grounded in concepts and principles, establishing an overarching perspective that guides reasoning. We conducted experiments with SoTA models, including ChatGPT, InstructGPT, and Llama3-70B-instruct, and achieved promising performance across tasks including knowledge QA, mathematical reasoning, and open-domain reasoning. For instance, OaP outperformed ChatGPT and CoT by 19.0% and 3.1% on MMLU’s College Physics, by 8.8% and 2.3% on GSM8k, and by 10.3% and 2.5% on StrategyQA, respectively.
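
To make the two-stage idea in the abstract concrete, below is a minimal Python sketch: the model is first asked to abstract the problem into an archetype and state the governing principles, then to solve under that strategy. The prompt wording and the `complete` callable are illustrative assumptions, not the paper's actual prompt or implementation.

# A minimal sketch of an OaP-style prompt, assuming a generic
# `complete(prompt) -> str` LLM callable. The template wording is
# illustrative, not the paper's actual prompt.

def overarching_prompt(problem: str) -> str:
    """Build a prompt that asks the model to form an overarching view
    (archetype + governing principles) before detailed reasoning."""
    return (
        "Take an overarching perspective before solving:\n"
        "1. Abstract the problem into a simplified archetype, setting "
        "aside low-level details.\n"
        "2. Identify the concepts and principles governing that archetype "
        "and formulate a strategy from them.\n"
        "3. Apply the strategy to the concrete conditions and solve "
        "step by step.\n\n"
        f"Problem: {problem}\n"
    )

def solve(problem: str, complete) -> str:
    """`complete` can wrap any chat/completion API (hypothetical here)."""
    return complete(overarching_prompt(problem))

if __name__ == "__main__":
    # Stub completion so the sketch runs without an API key.
    echo = lambda p: f"[model output for a {len(p)}-char prompt]"
    print(solve("A 2 kg block slides down a frictionless 30-degree "
                "incline; find its acceleration.", echo))

The point of the wrapper is only ordering: the archetype and principles are elicited before any condition-level computation, so the detailed steps are conditioned on the high-level strategy rather than the reverse.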
Anthology ID:
2025.naacl-long.66
Volume:
Proceedings of the 2025 Conference of the Nations of the Americas Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers)
Month:
April
Year:
2025
Address:
Albuquerque, New Mexico
Editors:
Luis Chiruzzo, Alan Ritter, Lu Wang
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
1433–1453
URL:
https://preview.aclanthology.org/landing_page/2025.naacl-long.66/
Cite (ACL):
Haoran Liao, Shaohua Hu, Zhihao Zhu, Hao He, and Yaohui Jin. 2025. Forest for the Trees: Overarching Prompting Evokes High-Level Reasoning in Large Language Models. In Proceedings of the 2025 Conference of the Nations of the Americas Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers), pages 1433–1453, Albuquerque, New Mexico. Association for Computational Linguistics.
Cite (Informal):
Forest for the Trees: Overarching Prompting Evokes High-Level Reasoning in Large Language Models (Liao et al., NAACL 2025)
PDF:
https://preview.aclanthology.org/landing_page/2025.naacl-long.66.pdf