Advancing Reasoning with Off-the-Shelf LLMs: A Semantic Structure Perspective

Pengfei He, Zitao Li, Yue Xing, Yaliang Li, Jiliang Tang, Bolin Ding


Abstract
Large Language Models (LLMs) have shown strong capabilities in zero-shot reasoning and generalization to new tasks. However, the zero-shot performance of general LLMs on complex tasks, such as multi-hop reasoning, remains suboptimal, while reasoning LLMs suffer from hallucinations and unfaithfulness. In this paper, we address these limitations by introducing a novel structure analysis method that helps LLMs better understand the question structure and guides the problem-solving process. We demonstrate that existing reasoning strategies, such as Chain-of-Thought and ReAct, benefit significantly from the LLM’s inherent understanding of semantic structure. We further ground our method in the theory of probabilistic graphical models to support its effectiveness. To enhance the reasoning process, we augment the structure analysis with refinement and retrieval capabilities, forming a multi-agent reasoning system called Structure-oriented Autonomous Reasoning Agents (SARA). Extensive experiments show that SARA significantly improves zero-shot performance on knowledge-intensive and mathematical tasks. Remarkably, our approach makes a general LLM competitive with dedicated reasoning models on several benchmarks and demonstrates strong robustness against corrupted reasoning paths.
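
The abstract describes SARA as a multi-agent pipeline that combines structure analysis with retrieval and refinement. As a purely illustrative aid, the Python sketch below shows one way such a pipeline might be wired together; it is not the authors' implementation, and every name here (call_llm, structure_agent, retrieval_agent, refine_agent) and every prompt string is a hypothetical placeholder. See the paper and checklist linked below for the actual method.

    # Minimal conceptual sketch of a structure-oriented multi-agent loop,
    # based only on the abstract (structure analysis -> retrieval -> refinement).
    # All function names and prompts are hypothetical, not taken from the paper.

    def call_llm(prompt: str) -> str:
        """Placeholder for any chat-completion client; returns a stub so the sketch runs."""
        return f"[LLM response to: {prompt[:60]}...]"

    def structure_agent(question: str) -> str:
        # Ask the model to expose the question's semantic structure
        # (entities, relations, sub-questions) before answering.
        return call_llm(
            "Analyze the structure of this question. List the entities, their "
            f"relations, and the sub-questions to resolve:\n{question}"
        )

    def retrieval_agent(structure: str) -> str:
        # Stand-in retrieval step; in practice this would query a search
        # index or knowledge base for each sub-question.
        return call_llm(f"List the facts needed to resolve:\n{structure}")

    def refine_agent(question: str, structure: str, evidence: str) -> str:
        # Draft an answer grounded in the structure and evidence,
        # then ask for a self-check and a corrected final answer.
        draft = call_llm(
            f"Question: {question}\nStructure: {structure}\nEvidence: {evidence}\n"
            "Reason step by step and answer."
        )
        return call_llm(
            f"Review this reasoning for errors and give a corrected final answer:\n{draft}"
        )

    def sara_style_pipeline(question: str) -> str:
        structure = structure_agent(question)
        evidence = retrieval_agent(structure)
        return refine_agent(question, structure, evidence)

    if __name__ == "__main__":
        print(sara_style_pipeline(
            "Who directed the film that won Best Picture the year the Berlin Wall fell?"
        ))

The point of the sketch is only the control flow implied by the abstract: the structure analysis runs first and conditions both retrieval and the final refinement step.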
Anthology ID:
2025.findings-emnlp.137
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2025
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rosé, Violet Peng
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
2538–2566
URL:
https://preview.aclanthology.org/author-page-yu-wang-polytechnic/2025.findings-emnlp.137/
DOI:
10.18653/v1/2025.findings-emnlp.137
Cite (ACL):
Pengfei He, Zitao Li, Yue Xing, Yaliang Li, Jiliang Tang, and Bolin Ding. 2025. Advancing Reasoning with Off-the-Shelf LLMs: A Semantic Structure Perspective. In Findings of the Association for Computational Linguistics: EMNLP 2025, pages 2538–2566, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Advancing Reasoning with Off-the-Shelf LLMs: A Semantic Structure Perspective (He et al., Findings 2025)
PDF:
https://preview.aclanthology.org/author-page-yu-wang-polytechnic/2025.findings-emnlp.137.pdf
Checklist:
2025.findings-emnlp.137.checklist.pdf