LogicTree: Structured Proof Exploration for Coherent and Rigorous Logical Reasoning with Large Language Models

Kang He, Kaushik Roy


Abstract
Large language models (LLMs) have achieved remarkable multi-step reasoning capabilities across various domains. However, LLMs still face distinct challenges in complex logical reasoning, as (1) proof-finding requires systematic exploration and the maintenance of logical coherence and (2) searching for the right combination of premises at each reasoning step is inherently challenging in tasks with a large premise space. To address this, we propose LogicTree, an inference-time modular framework employing algorithm-guided search to automate structured proof exploration and ensure logical coherence. Advancing beyond tree-of-thought (ToT), we incorporate a caching mechanism into LogicTree to enable effective utilization of historical knowledge, preventing reasoning stagnation and minimizing redundancy. Furthermore, we address the combinatorial complexity of premise search by decomposing it into a linear process. The refined premise selection restricts subsequent inference to at most one derivation per step, enhancing reasoning granularity and enforcing strict step-by-step reasoning. Additionally, we introduce two LLM-free heuristics for premise prioritization, enabling strategic proof search. Experimental results on five datasets demonstrate that LogicTree optimally scales inference-time computation to achieve higher proof accuracy, surpassing chain-of-thought (CoT) and ToT with average gains of 23.6% and 12.5%, respectively, on GPT-4o. Moreover, within LogicTree, GPT-4o outperforms o3-mini by 7.6% on average.
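The core ideas in the abstract — caching derived knowledge to avoid redundant exploration, and replacing combinatorial premise search with a linear check that yields at most one derivation per step — can be illustrated with a minimal sketch. This is not the paper's implementation (which drives exploration with an LLM); it is a hypothetical forward-chaining proof search over Horn-style rules, where a `cache` set stands in for the caching mechanism:

```python
from collections import deque

# Illustrative sketch, NOT the paper's algorithm: breadth-first proof
# search over Horn-style rules. The cache of derived facts mirrors
# LogicTree's reuse of historical knowledge; each rule fires only once
# its premises are all cached, replacing combinatorial premise search
# with a linear membership check.

def prove(facts, rules, goal):
    """facts: set of atoms; rules: list of (premises, conclusion); goal: atom."""
    cache = set(facts)          # everything derived so far (historical knowledge)
    frontier = deque(facts)     # atoms whose consequences are unexplored
    while frontier:
        frontier.popleft()
        for premises, conclusion in rules:
            # Linear check instead of searching premise combinations:
            # a rule fires iff all of its premises are already cached.
            if conclusion not in cache and all(p in cache for p in premises):
                cache.add(conclusion)       # at most one new derivation per rule firing
                frontier.append(conclusion)
                if conclusion == goal:
                    return True             # proof found
    return goal in cache

# Toy example: from fact A and rules A -> B, B -> C, derive goal C.
print(prove({"A"}, [({"A"}, "B"), ({"B"}, "C")], "C"))  # True
```

Because every derived fact enters the cache exactly once, no branch of the search re-derives known conclusions, which is the redundancy-avoidance property the abstract attributes to the caching mechanism.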
Anthology ID:
2025.emnlp-main.1054
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
20863–20892
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1054/
Cite (ACL):
Kang He and Kaushik Roy. 2025. LogicTree: Structured Proof Exploration for Coherent and Rigorous Logical Reasoning with Large Language Models. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 20863–20892, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
LogicTree: Structured Proof Exploration for Coherent and Rigorous Logical Reasoning with Large Language Models (He & Roy, EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1054.pdf
Checklist:
 2025.emnlp-main.1054.checklist.pdf