ClozeMath: Improving Mathematical Reasoning in Language Models by Learning to Fill Equations

Quang Hieu Pham, Thuy Duong Nguyen, Tung Pham, Anh Tuan Luu, Dat Quoc Nguyen


Abstract
The capabilities of large language models (LLMs) have been enhanced by training on data that reflects human thought processes, such as the Chain-of-Thought format. However, evidence suggests that the conventional scheme of next-word prediction may not fully capture how humans learn to think. Inspired by how humans generalize mathematical reasoning, we propose a new approach named ClozeMath to fine-tune LLMs for mathematical reasoning. ClozeMath uses a text-infilling task that predicts masked equations from a given solution, analogous to the cloze exercises used in human learning. Experiments on GSM8K, MATH, and GSM-Symbolic show that ClozeMath surpasses the strong baseline Masked Thought in performance and robustness under two test-time scaling decoding algorithms, Beam Search and Chain-of-Thought decoding. Additionally, we conduct an ablation study to analyze the effects of various architectural and implementation choices on our approach.
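To make the cloze-style training objective concrete, the sketch below shows one way equations could be masked out of a worked solution to form (masked input, target equations) pairs. It assumes GSM8K-style calculator annotations of the form <<3+4=7>>; the regex, sentinel format, and function name are illustrative assumptions, not the paper's implementation.

```python
import re

# Hypothetical sketch of equation masking for cloze-style fine-tuning.
# Assumes equations are marked with <<...>> spans, as in GSM8K solutions.
EQUATION_PATTERN = re.compile(r"<<[^>]*>>")

def mask_equations(solution: str, sentinel: str = "<mask_{i}>"):
    """Replace each <<equation>> span with a numbered sentinel and collect the targets."""
    targets = []

    def _replace(match: re.Match) -> str:
        targets.append(match.group(0))
        return sentinel.format(i=len(targets) - 1)

    masked = EQUATION_PATTERN.sub(_replace, solution)
    return masked, targets

if __name__ == "__main__":
    solution = (
        "Natalia sold 48 clips in April and half as many in May, "
        "so she sold <<48/2=24>>24 clips in May and "
        "<<48+24=72>>72 clips altogether."
    )
    masked, targets = mask_equations(solution)
    print(masked)   # ... sold <mask_0>24 clips in May and <mask_1>72 clips altogether.
    print(targets)  # ['<<48/2=24>>', '<<48+24=72>>']
```

A model fine-tuned on such pairs is asked to reconstruct the masked equations from the surrounding solution text, which is the cloze analogy the abstract describes.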
Anthology ID:
2025.findings-acl.738
Volume:
Findings of the Association for Computational Linguistics: ACL 2025
Month:
July
Year:
2025
Address:
Vienna, Austria
Editors:
Wanxiang Che, Joyce Nabende, Ekaterina Shutova, Mohammad Taher Pilehvar
Venues:
Findings | WS
Publisher:
Association for Computational Linguistics
Pages:
14322–14329
URL:
https://preview.aclanthology.org/ingestion-acl-25/2025.findings-acl.738/
Cite (ACL):
Quang Hieu Pham, Thuy Duong Nguyen, Tung Pham, Anh Tuan Luu, and Dat Quoc Nguyen. 2025. ClozeMath: Improving Mathematical Reasoning in Language Models by Learning to Fill Equations. In Findings of the Association for Computational Linguistics: ACL 2025, pages 14322–14329, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
ClozeMath: Improving Mathematical Reasoning in Language Models by Learning to Fill Equations (Pham et al., Findings 2025)
PDF:
https://preview.aclanthology.org/ingestion-acl-25/2025.findings-acl.738.pdf