DSMR-SQL: Enhancing Text-to-SQL with Dual-Strategy SQL Generation and Multi-Role SQL Selection

Yiming Huang, Jiyu Guo, Jichuan Zeng, Cuiyun Gao, Peiyi Han, Chuanyi Liu


Abstract
Recent advancements in Large Language Models (LLMs) have markedly improved SQL generation. Nevertheless, existing approaches typically rely on single-model designs, limiting their capacity to effectively handle complex user queries. In addition, current methods often face difficulties in selecting the optimal SQL from multiple candidates. To mitigate these limitations, this study presents DSMR-SQL, a two-stage framework consisting of: (1) Dual-Strategy SQL Generation: DSMR-SQL aims to produce a broader spectrum of SQL queries by using multiple models with two strategies: Supervised Fine-Tuning and In-Context Learning; (2) Multi-Role SQL Selection: DSMR-SQL seeks to identify the SQL best aligned with user intent by introducing a collaborative framework involving three roles (i.e., Proposer, Critic, Summarizer). Extensive experiments on various datasets substantiate the efficacy of DSMR-SQL in enhancing SQL generation.
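The two-stage pipeline described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: every function and role below (the generator stubs, `proposer`, `critic`, `summarizer`) is a hypothetical stand-in for an LLM-backed component.

```python
# Hypothetical sketch of a DSMR-SQL-style pipeline. All names are
# illustrative assumptions, not the paper's actual code.

def generate_candidates(question, generators):
    """Stage 1: Dual-Strategy SQL Generation.
    Each generator (e.g., a Supervised Fine-Tuned model or an
    In-Context-Learning-prompted model) proposes one candidate SQL query."""
    return [gen(question) for gen in generators]

def select_sql(question, candidates, proposer, critic, summarizer):
    """Stage 2: Multi-Role SQL Selection.
    The Proposer nominates a candidate, the Critic reviews each one,
    and the Summarizer makes the final decision."""
    proposal = proposer(question, candidates)
    critiques = [critic(question, sql) for sql in candidates]
    return summarizer(question, candidates, proposal, critiques)

# Toy stand-ins for the LLM-backed roles (for illustration only).
sft_model = lambda q: "SELECT name FROM users"
icl_model = lambda q: "SELECT name FROM users WHERE age > 30"

proposer = lambda q, cands: cands[0]
critic = lambda q, sql: "WHERE" in sql  # naive check: does the query filter?
summarizer = lambda q, cands, prop, crits: next(
    (sql for sql, ok in zip(cands, crits) if ok), prop
)

question = "List names of users older than 30"
candidates = generate_candidates(question, [sft_model, icl_model])
print(select_sql(question, candidates, proposer, critic, summarizer))
# → SELECT name FROM users WHERE age > 30
```

In practice each role would be a separate LLM prompt or model rather than a lambda; the point of the sketch is the data flow: multiple generation strategies widen the candidate pool, and the collaborative selection stage reconciles them into one final query.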
Anthology ID:
2025.ccl-1.86
Volume:
Proceedings of the 24th China National Conference on Computational Linguistics (CCL 2025)
Month:
August
Year:
2025
Address:
Jinan, China
Editors:
Maosong Sun, Peiyong Duan, Zhiyuan Liu, Ruifeng Xu, Weiwei Sun
Venue:
CCL
Publisher:
Chinese Information Processing Society of China
Pages:
1148–1177
URL:
https://preview.aclanthology.org/ingest-ccl/2025.ccl-1.86/
Cite (ACL):
Yiming Huang, Jiyu Guo, Jichuan Zeng, Cuiyun Gao, Peiyi Han, and Chuanyi Liu. 2025. DSMR-SQL: Enhancing Text-to-SQL with Dual-Strategy SQL Generation and Multi-Role SQL Selection. In Proceedings of the 24th China National Conference on Computational Linguistics (CCL 2025), pages 1148–1177, Jinan, China. Chinese Information Processing Society of China.
Cite (Informal):
DSMR-SQL: Enhancing Text-to-SQL with Dual-Strategy SQL Generation and Multi-Role SQL Selection (Huang et al., CCL 2025)
PDF:
https://preview.aclanthology.org/ingest-ccl/2025.ccl-1.86.pdf