Theorem-Validated Reverse Chain-of-Thought Problem Generation for Geometric Reasoning
Deng Linger | Linghao Zhu | Yuliang Liu | Yu Wang | Qunyi Xie | Jingjing Wu | Gang Zhang | Yingying Zhu | Xiang Bai
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Large Multimodal Models (LMMs) face limitations in geometric reasoning due to insufficient Chain-of-Thought (CoT) image-text training data. While existing approaches leverage template-based or LLM-assisted methods for geometric CoT data creation, they often struggle to achieve both diversity and precision. To bridge this gap, we introduce a two-stage Theorem-Validated Reverse Chain-of-Thought Reasoning Synthesis (TR-CoT) framework. The first stage, TR-Engine, synthesizes theorem-grounded geometric diagrams with structured descriptions and properties. The second stage, TR-Reasoner, employs reverse reasoning to iteratively refine question-answer pairs by cross-validating geometric properties and description fragments. Our approach expands theorem-type coverage, corrects long-standing misunderstandings, and enhances geometric reasoning. Fine-grained CoT improves theorem understanding and increases logical consistency by 24.5%. Our best models surpass the baselines on MathVista and GeoQA by 10.1% and 4.7%, outperforming advanced closed-source models like GPT-4o.
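To make the two-stage pipeline described above concrete, the following is a minimal Python sketch of the data flow: a diagram-synthesis stage followed by a reverse-reasoning stage that cross-validates candidate question-answer pairs against the diagram's properties. All function names, data fields, and the placeholder logic are illustrative assumptions, not the authors' actual implementation, which relies on theorem-grounded diagram generation and LLM-based reasoning.

```python
from dataclasses import dataclass, field

@dataclass
class Diagram:
    theorem: str                                          # theorem the diagram is grounded in
    description: str                                      # structured textual description
    properties: list[str] = field(default_factory=list)   # derived geometric properties

def tr_engine(theorem: str) -> Diagram:
    """Stage 1 (assumed): synthesize a theorem-grounded diagram with a
    structured description and its geometric properties."""
    # Placeholder synthesis; the paper's TR-Engine produces an actual image plus text.
    description = f"triangle ABC illustrating the {theorem}."
    properties = [f"the {theorem} holds for triangle ABC"]
    return Diagram(theorem, description, properties)

def propose_qa(diagram: Diagram) -> tuple[str, str]:
    """Assumed reverse-reasoning step: start from a known property (the answer)
    and work backwards to a question it supports."""
    answer = diagram.properties[0]
    question = f"Given {diagram.description} What can be concluded?"
    return question, answer

def cross_validate(qa: tuple[str, str], diagram: Diagram) -> bool:
    """Assumed check: keep a QA pair only if its answer is consistent with the
    diagram's properties and the question references its description."""
    question, answer = qa
    return answer in diagram.properties and diagram.theorem in question

def tr_reasoner(diagram: Diagram, max_rounds: int = 3) -> tuple[str, str] | None:
    """Stage 2 (assumed): iteratively refine QA pairs until one passes cross-validation."""
    for _ in range(max_rounds):
        qa = propose_qa(diagram)
        if cross_validate(qa, diagram):
            return qa
    return None

if __name__ == "__main__":
    diagram = tr_engine("Pythagorean theorem")
    print(tr_reasoner(diagram))
```

In the paper's setting, the proposal and validation steps would be driven by an LLM and the theorem library rather than string checks; the sketch only mirrors the generate-then-cross-validate loop structure.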