Lihan Wang
2022
SUN: Exploring Intrinsic Uncertainties in Text-to-SQL Parsers
Bowen Qin | Lihan Wang | Binyuan Hui | Bowen Li | Xiangpeng Wei | Binhua Li | Fei Huang | Luo Si | Min Yang | Yongbin Li
Proceedings of the 29th International Conference on Computational Linguistics
This paper aims to improve the performance of text-to-SQL parsing by exploring the intrinsic uncertainties in neural network-based approaches (called SUN). From the data uncertainty perspective, it is indisputable that a single SQL query can be learned from multiple semantically equivalent questions. Different from previous methods that are limited to one-to-one mappings, we propose a data uncertainty constraint to explore the underlying complementary semantic information among multiple semantically equivalent questions (many-to-one) and learn robust feature representations with reduced spurious associations. In this way, we can reduce the sensitivity of the learned representations and improve the robustness of the parser. From the model uncertainty perspective, there is often structural information (dependence) among the weights of neural networks. To improve the generalizability and stability of neural text-to-SQL parsers, we propose a model uncertainty constraint to refine the query representations by enforcing the output representations of different perturbed encoding networks to be consistent with each other. Extensive experiments on five benchmark datasets demonstrate that our method significantly outperforms strong competitors and achieves new state-of-the-art results.
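As a rough illustration of the model uncertainty idea described in the abstract (a minimal sketch, not the authors' exact SUN objective), the snippet below adds a consistency term that pushes two dropout-perturbed forward passes of the same encoder toward agreeing output distributions. The encoder interface, the loss weight `alpha`, and the symmetric-KL form are assumptions made for this example.

```python
# Minimal sketch of a consistency-style model-uncertainty constraint
# (illustrative only; not the exact SUN formulation). Two forward passes
# through the same dropout-perturbed encoder are encouraged to agree via
# a symmetric KL term added to the task loss.
import torch
import torch.nn.functional as F


def consistency_loss(logits_a: torch.Tensor, logits_b: torch.Tensor) -> torch.Tensor:
    """Symmetric KL divergence between the two perturbed output distributions."""
    p = F.log_softmax(logits_a, dim=-1)
    q = F.log_softmax(logits_b, dim=-1)
    kl_pq = F.kl_div(p, q.exp(), reduction="batchmean")
    kl_qp = F.kl_div(q, p.exp(), reduction="batchmean")
    return 0.5 * (kl_pq + kl_qp)


def training_step(encoder, batch, targets, alpha: float = 1.0):
    # Dropout inside `encoder` makes the two passes differently perturbed.
    logits_a = encoder(batch)
    logits_b = encoder(batch)
    task_loss = F.cross_entropy(logits_a, targets)
    return task_loss + alpha * consistency_loss(logits_a, logits_b)
```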
S2SQL: Injecting Syntax to Question-Schema Interaction Graph Encoder for Text-to-SQL Parsers
Binyuan Hui | Ruiying Geng | Lihan Wang | Bowen Qin | Yanyang Li | Bowen Li | Jian Sun | Yongbin Li
Findings of the Association for Computational Linguistics: ACL 2022
The task of converting a natural language question into an executable SQL query, known as text-to-SQL, is an important branch of semantic parsing. The state-of-the-art graph-based encoder has been successfully used in this task but does not model the question syntax well. In this paper, we propose S2SQL, which injects Syntax into the question-Schema graph encoder for Text-to-SQL parsers and effectively leverages the syntactic dependency information of questions to improve performance. We also employ a decoupling constraint to induce diverse relational edge embeddings, which further improves the network's performance. Experiments on Spider and the robustness setting Spider-Syn demonstrate that the proposed approach outperforms all existing methods when pre-trained models are used, achieving a performance that ranks first on the Spider leaderboard.
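To make the syntax-injection idea concrete, here is a small, self-contained sketch of adding dependency edges among question-token nodes to a question-schema interaction graph. The node layout, the edge labels, and the precomputed `dep_heads` input are assumptions for this example; it is not the S2SQL implementation.

```python
# Illustrative sketch: extend a question-schema interaction graph with
# syntactic dependency edges among question tokens. Node ordering
# (question tokens first, then schema items) and edge labels are
# assumptions for this example, not the S2SQL implementation.
from typing import List, Tuple

Edge = Tuple[int, int, str]  # (source node, target node, relation label)


def build_graph_edges(
    question_tokens: List[str],
    dep_heads: List[int],                 # dep_heads[i] = index of token i's head, -1 for root
    schema_items: List[str],
    schema_links: List[Tuple[int, int]],  # (token index, schema item index) matches
) -> List[Edge]:
    n_q = len(question_tokens)
    edges: List[Edge] = []

    # Question-schema linking edges (the usual interaction-graph relations).
    for tok_idx, item_idx in schema_links:
        edges.append((tok_idx, n_q + item_idx, "q-schema-match"))
        edges.append((n_q + item_idx, tok_idx, "schema-q-match"))

    # Injected syntax: one edge pair per dependency arc between question tokens.
    for child, head in enumerate(dep_heads):
        if head >= 0:
            edges.append((child, head, "dep-child-to-head"))
            edges.append((head, child, "dep-head-to-child"))
    return edges


if __name__ == "__main__":
    tokens = ["show", "all", "singers", "from", "France"]
    heads = [-1, 2, 0, 2, 3]  # a hand-written parse, for illustration only
    schema = ["singer", "singer.country"]
    links = [(2, 0), (4, 1)]
    for edge in build_graph_edges(tokens, heads, schema, links):
        print(edge)
```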