Unsupervised Dual Paraphrasing for Two-stage Semantic Parsing

Ruisheng Cao, Su Zhu, Chenyu Yang, Chen Liu, Rao Ma, Yanbin Zhao, Lu Chen, Kai Yu


Abstract
A daunting problem for semantic parsing is the scarcity of annotated data. To reduce nontrivial human labor, we propose a two-stage semantic parsing framework: the first stage uses an unsupervised paraphrase model to convert an unlabeled natural language utterance into its canonical utterance, and the downstream naive semantic parser accepts this intermediate output and returns the target logical form. The training process is split into two phases, pre-training and cycle learning, and three tailored self-supervised tasks are introduced throughout training to activate the unsupervised paraphrase model. Experimental results on the Overnight and GeoGranno benchmarks demonstrate that our framework is effective and compatible with supervised training.
Anthology ID:
2020.acl-main.608
Volume:
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2020
Address:
Online
Editors:
Dan Jurafsky, Joyce Chai, Natalie Schluter, Joel Tetreault
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
6806–6817
URL:
https://aclanthology.org/2020.acl-main.608
DOI:
10.18653/v1/2020.acl-main.608
Cite (ACL):
Ruisheng Cao, Su Zhu, Chenyu Yang, Chen Liu, Rao Ma, Yanbin Zhao, Lu Chen, and Kai Yu. 2020. Unsupervised Dual Paraphrasing for Two-stage Semantic Parsing. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 6806–6817, Online. Association for Computational Linguistics.
Cite (Informal):
Unsupervised Dual Paraphrasing for Two-stage Semantic Parsing (Cao et al., ACL 2020)
PDF:
https://preview.aclanthology.org/landing_page/2020.acl-main.608.pdf
Video:
http://slideslive.com/38929390
Code:
rhythmcao/unsup-two-stage-semantic-parsing