SymBa: Symbolic Backward Chaining for Structured Natural Language Reasoning

Jinu Lee, Wonseok Hwang


Abstract
To improve the performance and explainability of LLM-based natural language reasoning, structured reasoning can be applied to generate explicitly structured proofs. Among the various approaches to structured reasoning, we focus on backward chaining, where the proof goal is recursively decomposed into subgoals by searching for and applying rules. We argue that current LLM-based backward chaining systems (e.g., Least-to-most prompting and LAMBADA) are incomplete, as they omit crucial algorithmic components identified in the classic backward chaining algorithm from computational logic (SLD resolution). To address this, we propose SymBa (Symbolic Backward Chaining), a novel backward chaining system that integrates a symbolic solver and an LLM. In SymBa, the solver controls the proof process, and the LLM is called only when the solver requires new information to complete the proof. Empowered by this completeness, SymBa achieves significant improvements over the baselines on seven deductive, relational, and arithmetic reasoning benchmarks.
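The backward-chaining mechanism the abstract describes can be illustrated with a minimal sketch. The following is a generic SLD-resolution-style prover over Horn clauses, not the paper's actual implementation; the knowledge base, predicate names, and function names are invented for illustration. A goal is proved by unifying it against facts, or against rule heads whose bodies then become new subgoals.

```python
# Minimal backward-chaining (SLD-resolution-style) sketch.
# Illustrative only: the facts, rules, and API below are not from the paper.

# Facts are ground atoms; rules pair a head atom with a list of body atoms.
FACTS = {("parent", "tom", "bob"), ("parent", "bob", "ann")}
RULES = [
    # grandparent(X, Z) :- parent(X, Y), parent(Y, Z).
    (("grandparent", "X", "Z"), [("parent", "X", "Y"), ("parent", "Y", "Z")]),
]

def is_var(t):
    """Treat capitalized strings as logic variables (Prolog convention)."""
    return isinstance(t, str) and t[:1].isupper()

def subst(term, binding):
    """Follow the binding chain to resolve a term to its current value."""
    if is_var(term):
        v = binding.get(term, term)
        return subst(v, binding) if v != term else term
    return term

def unify(a, b, binding):
    """Unify two atoms; return an extended binding dict, or None on failure."""
    if len(a) != len(b) or a[0] != b[0]:
        return None
    b2 = dict(binding)
    for x, y in zip(a[1:], b[1:]):
        x, y = subst(x, b2), subst(y, b2)
        if x == y:
            continue
        if is_var(x):
            b2[x] = y
        elif is_var(y):
            b2[y] = x
        else:
            return None
    return b2

def prove(goals, binding=None):
    """Recursively decompose goals into subgoals, yielding proof bindings."""
    binding = binding if binding is not None else {}
    if not goals:
        yield binding
        return
    goal, rest = goals[0], goals[1:]
    for fact in FACTS:
        b = unify(goal, fact, binding)
        if b is not None:
            yield from prove(rest, b)
    for head, body in RULES:
        b = unify(goal, head, binding)
        if b is not None:
            yield from prove(body + rest, b)
    # Per the abstract, this is roughly where SymBa's solver would instead
    # call the LLM to supply new facts or rules when the search gets stuck.
```

In this toy run, `prove([("grandparent", "tom", "Who")])` decomposes the goal via the rule into two `parent` subgoals and binds `Who` to `ann`. SymBa's contribution, as the abstract states, is letting a symbolic solver of this kind drive the search while an LLM fills in missing statements on demand.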
Anthology ID:
2025.naacl-long.124
Volume:
Proceedings of the 2025 Conference of the Nations of the Americas Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers)
Month:
April
Year:
2025
Address:
Albuquerque, New Mexico
Editors:
Luis Chiruzzo, Alan Ritter, Lu Wang
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
2468–2484
URL:
https://preview.aclanthology.org/corrections-2025-06/2025.naacl-long.124/
DOI:
10.18653/v1/2025.naacl-long.124
Bibkey:
Cite (ACL):
Jinu Lee and Wonseok Hwang. 2025. SymBa: Symbolic Backward Chaining for Structured Natural Language Reasoning. In Proceedings of the 2025 Conference of the Nations of the Americas Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers), pages 2468–2484, Albuquerque, New Mexico. Association for Computational Linguistics.
Cite (Informal):
SymBa: Symbolic Backward Chaining for Structured Natural Language Reasoning (Lee & Hwang, NAACL 2025)
PDF:
https://preview.aclanthology.org/corrections-2025-06/2025.naacl-long.124.pdf