Compositional Task-Oriented Parsing as Abstractive Question Answering

Wenting Zhao, Konstantine Arkoudas, Weiqi Sun, Claire Cardie


Abstract
Task-oriented parsing (TOP) aims to convert natural language into machine-readable representations of specific tasks, such as setting an alarm. A popular approach to TOP is to apply seq2seq models to generate linearized parse trees. A more recent line of work argues that pretrained seq2seq models are better at generating outputs that are themselves natural language, so they replace linearized parse trees with canonical natural-language paraphrases that can then be easily translated into parse trees, resulting in so-called naturalized parsers. In this work we continue to explore naturalized semantic parsing by presenting a general reduction of TOP to abstractive question answering that overcomes some limitations of canonical paraphrasing. Experimental results show that our QA-based technique outperforms state-of-the-art methods in full-data settings while achieving dramatic improvements in few-shot settings.
Anthology ID:
2022.naacl-main.328
Volume:
Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Month:
July
Year:
2022
Address:
Seattle, United States
Editors:
Marine Carpuat, Marie-Catherine de Marneffe, Ivan Vladimir Meza Ruiz
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
4418–4427
URL:
https://aclanthology.org/2022.naacl-main.328
DOI:
10.18653/v1/2022.naacl-main.328
Cite (ACL):
Wenting Zhao, Konstantine Arkoudas, Weiqi Sun, and Claire Cardie. 2022. Compositional Task-Oriented Parsing as Abstractive Question Answering. In Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 4418–4427, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
Compositional Task-Oriented Parsing as Abstractive Question Answering (Zhao et al., NAACL 2022)
PDF:
https://preview.aclanthology.org/nschneid-patch-2/2022.naacl-main.328.pdf
Video:
https://preview.aclanthology.org/nschneid-patch-2/2022.naacl-main.328.mp4
Code:
amazon-research/semantic-parsing-as-abstractive-qa
Data:
TOPv2