Enhancing Text-to-SQL with Question Classification and Multi-Agent Collaboration

Zhihui Shao, Shubin Cai, Rongsheng Lin, Zhong Ming
Abstract
Large Language Models (LLMs) have recently demonstrated remarkable performance on Text-to-SQL tasks. However, existing research focuses primarily on prompt optimization and workflow improvements; few studies examine the questions themselves. In this paper, we propose a Text-to-SQL framework based on question classification and multi-agent collaboration (QCMA-SQL). Specifically, we first train a schema selector with multiple cross-attention mechanisms to classify questions and select the most suitable database schema. We then dispatch each question to an agent suited to its difficulty level to generate a preliminary SQL query, and apply syntax validation and execution optimization steps to produce the final SQL query. Experimental results on the Spider dataset show that QCMA-SQL achieves an execution accuracy of 87.4%, outperforming state-of-the-art methods. Ablation studies show that question classification contributes a 2.8% gain in execution accuracy.
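
The pipeline the abstract describes (classify a question's difficulty, route it to a suitable agent, then validate and execute the generated SQL) can be illustrated in miniature. The Python sketch below is an assumption-laden illustration, not the paper's implementation: classify_difficulty is a keyword heuristic standing in for the trained cross-attention schema selector, the two agent functions stand in for LLM calls, and all names (run_pipeline, is_valid_sql, the toy singer table) are hypothetical.

# Minimal sketch of a QCMA-SQL-style pipeline, under the assumptions above.
import sqlite3

def classify_difficulty(question: str) -> str:
    """Placeholder for the trained question classifier / schema selector."""
    hard_cues = ("group by", "nested", "for each", "more than", "average")
    return "hard" if any(c in question.lower() for c in hard_cues) else "easy"

def easy_agent(question: str, schema: str) -> str:
    """Stand-in for an LLM agent prompted for simple questions."""
    return "SELECT name FROM singer;"  # a real agent would call an LLM here

def hard_agent(question: str, schema: str) -> str:
    """Stand-in for an LLM agent prompted for complex questions."""
    return "SELECT country, COUNT(*) FROM singer GROUP BY country;"

def is_valid_sql(sql: str, conn: sqlite3.Connection) -> bool:
    """Syntax-validation step: EXPLAIN parses the query without running it."""
    try:
        conn.execute("EXPLAIN " + sql)
        return True
    except sqlite3.Error:
        return False

def run_pipeline(question: str, schema: str, conn: sqlite3.Connection):
    # Route by difficulty, generate a preliminary query, validate, then execute.
    agent = hard_agent if classify_difficulty(question) == "hard" else easy_agent
    sql = agent(question, schema)
    if not is_valid_sql(sql, conn):
        # One retry on a syntax error; the paper's execution-optimization
        # step would refine the query further using execution feedback.
        sql = agent(question + " (regenerate: previous SQL was invalid)", schema)
    return conn.execute(sql).fetchall()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE singer (name TEXT, country TEXT)")
    conn.executemany("INSERT INTO singer VALUES (?, ?)",
                     [("A", "US"), ("B", "UK"), ("C", "US")])
    print(run_pipeline("How many singers are there for each country?",
                       "singer(name, country)", conn))

Running the demo routes the "for each" question to the hard-question agent and prints the per-country counts; the point is only the control flow (classify, route, validate, execute), not the stubbed components.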
Anthology ID:
2025.findings-naacl.245
Volume:
Findings of the Association for Computational Linguistics: NAACL 2025
Month:
April
Year:
2025
Address:
Albuquerque, New Mexico
Editors:
Luis Chiruzzo, Alan Ritter, Lu Wang
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
4340–4349
URL:
https://preview.aclanthology.org/fix-sig-urls/2025.findings-naacl.245/
Cite (ACL):
Zhihui Shao, Shubin Cai, Rongsheng Lin, and Zhong Ming. 2025. Enhancing Text-to-SQL with Question Classification and Multi-Agent Collaboration. In Findings of the Association for Computational Linguistics: NAACL 2025, pages 4340–4349, Albuquerque, New Mexico. Association for Computational Linguistics.
Cite (Informal):
Enhancing Text-to-SQL with Question Classification and Multi-Agent Collaboration (Shao et al., Findings 2025)
PDF:
https://preview.aclanthology.org/fix-sig-urls/2025.findings-naacl.245.pdf