One More Question is Enough, Expert Question Decomposition (EQD) Model for Domain Quantitative Reasoning

Mengyu Wang, Sotirios Sabanis, Miguel de Carvalho, Shay B Cohen, Tiejun Ma


Abstract
Domain-specific quantitative reasoning remains a major challenge for large language models (LLMs), especially in fields requiring expert knowledge and complex question answering (QA). In this work, we propose Expert Question Decomposition (EQD), an approach designed to balance the use of domain knowledge with computational efficiency. EQD is built on a two-step fine-tuning framework and guided by a reward function that measures the effectiveness of generated sub-questions in improving QA outcomes. It requires only a few thousand training examples and a single A100 GPU for fine-tuning, with inference time comparable to zero-shot prompting. Beyond its efficiency, EQD outperforms state-of-the-art domain-tuned models and advanced prompting strategies. We evaluate EQD in the financial domain, characterized by specialized knowledge and complex quantitative reasoning, across four benchmark datasets. Our method consistently improves QA performance by 0.6% to 10.5% across different LLMs. Our analysis reveals an important insight: in domain-specific QA, a single supporting question often provides greater benefit than detailed guidance steps.
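The reward idea described in the abstract, scoring a generated sub-question by how much it improves downstream QA outcomes, can be sketched as follows. The function names and the exact-match scoring below are illustrative assumptions for a toy sketch, not the paper's actual implementation.

```python
# Toy sketch of the abstract's reward signal: the reward for a generated
# sub-question is the gain in QA correctness when the QA model is
# conditioned on that sub-question versus answering the question alone.

def qa_score(answer: str, gold: str) -> float:
    """Toy correctness score: 1.0 for an exact match with the gold answer, else 0.0."""
    return 1.0 if answer.strip() == gold.strip() else 0.0

def decomposition_reward(answer_with_subq: str,
                         answer_without_subq: str,
                         gold: str) -> float:
    """Reward = score gain attributable to the supporting sub-question."""
    return qa_score(answer_with_subq, gold) - qa_score(answer_without_subq, gold)
```

In this toy form, a sub-question that turns a wrong answer into a correct one earns reward 1.0, one that changes nothing earns 0.0, and one that derails a previously correct answer earns -1.0, giving the fine-tuning step a direct signal of sub-question effectiveness.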
Anthology ID:
2025.findings-emnlp.1108
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2025
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
20355–20369
URL:
https://preview.aclanthology.org/author-page-yu-wang-polytechnic/2025.findings-emnlp.1108/
DOI:
10.18653/v1/2025.findings-emnlp.1108
Cite (ACL):
Mengyu Wang, Sotirios Sabanis, Miguel de Carvalho, Shay B Cohen, and Tiejun Ma. 2025. One More Question is Enough, Expert Question Decomposition (EQD) Model for Domain Quantitative Reasoning. In Findings of the Association for Computational Linguistics: EMNLP 2025, pages 20355–20369, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
One More Question is Enough, Expert Question Decomposition (EQD) Model for Domain Quantitative Reasoning (Wang et al., Findings 2025)
PDF:
https://preview.aclanthology.org/author-page-yu-wang-polytechnic/2025.findings-emnlp.1108.pdf
Checklist:
2025.findings-emnlp.1108.checklist.pdf