Grammar-Constrained Decoding Makes Large Language Models Better Logical Parsers

Federico Raspanti, Tanir Ozcelebi, Mike Holenderski


Abstract
Large Language Models (LLMs) have shown strong capabilities across a range of natural language processing tasks, yet they often struggle with logical reasoning, particularly over complex natural language statements. To address this challenge, approaches that combine LLMs with symbolic reasoners have been proposed: the LLM translates natural language statements into symbolic representations, which are then verified by an external symbolic solver. However, ensuring the syntactic correctness of these translations remains a significant challenge. We propose constraining the outputs of LLMs with Grammar-Constrained Decoding, and show that it consistently improves both syntactic correctness and semantic accuracy in logical parsing tasks. Our findings suggest that grammar constraints can serve as an effective substitute for in-context examples, which is especially beneficial for resource-constrained applications using smaller models.
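The paper's actual grammars and models are not reproduced here, but the core mechanism of grammar-constrained decoding can be illustrated with a small self-contained sketch. The toy grammar below (for propositional formulas like `( A and B )`) and the word-level vocabulary are illustrative assumptions, not the paper's setup; in a real system the same "allowed next tokens" computation would be turned into a mask over the LLM's logits at each decoding step, so the model can only ever emit a string the grammar accepts.

```python
# Toy grammar for propositional formulas:
#   Expr -> VAR | "not" Expr | "(" Expr OP Expr ")"
# (Hypothetical grammar and vocabulary for illustration only.)
VARS = {"A", "B", "C"}
OPS = {"and", "or"}
VOCAB = VARS | OPS | {"not", "(", ")"}

def parse_expr(toks, i):
    """Return (ends, prefix): `ends` is the set of positions j such that
    toks[i:j] derives a complete Expr; `prefix` is True if some derivation
    consumed all remaining tokens without completing (i.e. toks[i:] is a
    strict prefix of an Expr)."""
    ends, prefix = set(), False
    if i == len(toks):
        return ends, True  # ran out of input mid-derivation: valid prefix
    t = toks[i]
    if t in VARS:                       # Expr -> VAR
        ends.add(i + 1)
    if t == "not":                      # Expr -> "not" Expr
        e, p = parse_expr(toks, i + 1)
        ends |= e
        prefix |= p
    if t == "(":                        # Expr -> "(" Expr OP Expr ")"
        e1, p1 = parse_expr(toks, i + 1)
        prefix |= p1
        for j in e1:
            if j == len(toks):
                prefix = True           # still need an OP
            elif toks[j] in OPS:
                e2, p2 = parse_expr(toks, j + 1)
                prefix |= p2
                for k in e2:
                    if k == len(toks):
                        prefix = True   # still need the closing ")"
                    elif toks[k] == ")":
                        ends.add(k + 1)
    return ends, prefix

def allowed_next(toks):
    """Tokens t such that toks + [t] is a complete formula or can still be
    extended to one. This is the set a constrained decoder would keep when
    masking the model's logits at this step."""
    out = set()
    for t in VOCAB:
        ends, prefix = parse_expr(toks + [t], 0)
        if prefix or (len(toks) + 1) in ends:
            out.add(t)
    return out

if __name__ == "__main__":
    print(allowed_next([]))                       # variables, "not", "("
    print(allowed_next(["(", "A"]))               # only "and" / "or"
    print(allowed_next(["(", "A", "and", "B"]))   # only ")"
```

Enumerating the whole vocabulary per step is fine for a toy but quadratic in practice; production implementations instead compile the grammar into an automaton that tracks the parser state incrementally across decoding steps.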
Anthology ID:
2025.acl-industry.34
Volume:
Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 6: Industry Track)
Month:
July
Year:
2025
Address:
Vienna, Austria
Editors:
Georg Rehm, Yunyao Li
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
485–499
URL:
https://preview.aclanthology.org/landing_page/2025.acl-industry.34/
Cite (ACL):
Federico Raspanti, Tanir Ozcelebi, and Mike Holenderski. 2025. Grammar-Constrained Decoding Makes Large Language Models Better Logical Parsers. In Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 6: Industry Track), pages 485–499, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
Grammar-Constrained Decoding Makes Large Language Models Better Logical Parsers (Raspanti et al., ACL 2025)
PDF:
https://preview.aclanthology.org/landing_page/2025.acl-industry.34.pdf