Abstract
Utilizing large language models (LLMs) for data augmentation has yielded encouraging results in mathematical reasoning. However, these approaches are constrained in problem diversity, potentially restricting them to in-domain or in-distribution data generation. To this end, we propose **ControlMath**, an iterative method involving an equation-generator module and two LLM-based agents. The module creates diverse equations, which the Problem-Crafter agent then transforms into math word problems. The Reverse-Agent filters and selects high-quality data, adhering to the "less is more" principle. This approach enables the generation of diverse math problems that are not limited to specific domains or distributions. As a result, we collect ControlMathQA, a dataset of 190k math word problems. Extensive experiments show that combining our dataset with in-domain datasets such as GSM8K improves the model's ability to generalize mathematically, leading to better performance both within and beyond specific domains.
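The abstract describes a generate-craft-verify loop: an equation generator seeds candidates, a Problem-Crafter agent rewrites each equation as a word problem, and a Reverse-Agent keeps only problems whose independently produced answer matches the seed. As a rough illustration only, the Python sketch below mimics that loop; it is not the authors' implementation, and every name, prompt, and helper here (including the `call_llm` stub, which must be replaced with a real LLM client) is an assumption.

```python
import random


def call_llm(prompt: str) -> str:
    """Hypothetical LLM helper; replace with a real API client before running."""
    raise NotImplementedError("plug in your LLM client here")


def generate_equation() -> tuple[str, int]:
    """Equation-generator sketch: emit a random linear equation and its solution."""
    a, b = random.randint(2, 9), random.randint(1, 20)
    x = random.randint(1, 50)
    return f"{a} * x + {b} = {a * x + b}", x


def craft_problem(equation: str) -> str:
    """Problem-Crafter sketch: ask the LLM to wrap the equation in a word problem."""
    return call_llm(f"Write a math word problem whose solution requires solving: {equation}")


def reverse_check(problem: str, answer: int) -> bool:
    """Reverse-Agent sketch: re-solve the crafted problem and compare answers."""
    solved = call_llm(f"Solve this problem. Reply with only the final number:\n{problem}")
    try:
        return int(solved.strip()) == answer
    except ValueError:
        return False  # discard unparseable responses


def controlmath_round(n: int) -> list[dict]:
    """One iteration: generate n candidates and keep only those passing the reverse check."""
    kept = []
    for _ in range(n):
        equation, answer = generate_equation()
        problem = craft_problem(equation)
        if reverse_check(problem, answer):
            kept.append({"equation": equation, "problem": problem, "answer": answer})
    return kept
```

Keeping only the candidates that survive the reverse check is one plausible way to realize the "less is more" filtering the abstract mentions: a smaller, verified set rather than every generated problem.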
- Anthology ID: 2024.emnlp-main.680
- Volume: Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing
- Month: November
- Year: 2024
- Address: Miami, Florida, USA
- Editors: Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
- Venue: EMNLP
- Publisher: Association for Computational Linguistics
- Pages: 12201–12217
- URL: https://aclanthology.org/2024.emnlp-main.680
- DOI: 10.18653/v1/2024.emnlp-main.680
- Cite (ACL): Nuo Chen, Ning Wu, Jianhui Chang, Linjun Shou, and Jia Li. 2024. ControlMath: Controllable Data Generation Promotes Math Generalist Models. In Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, pages 12201–12217, Miami, Florida, USA. Association for Computational Linguistics.
- Cite (Informal): ControlMath: Controllable Data Generation Promotes Math Generalist Models (Chen et al., EMNLP 2024)
- PDF: https://preview.aclanthology.org/dois-2013-emnlp/2024.emnlp-main.680.pdf