Non-Autoregressive Semantic Parsing for Compositional Task-Oriented Dialog

Arun Babu, Akshat Shrivastava, Armen Aghajanyan, Ahmed Aly, Angela Fan, Marjan Ghazvininejad


Abstract
Semantic parsing using sequence-to-sequence models allows parsing of deeper representations compared to traditional word-tagging-based models. In spite of these advantages, widespread adoption of these models for real-time conversational use cases has been stymied by higher compute requirements and thus higher latency. In this work, we propose a non-autoregressive approach to predict semantic parse trees with an efficient seq2seq model architecture. By combining non-autoregressive prediction with convolutional neural networks, we achieve significant latency gains and parameter size reduction compared to traditional RNN models. Our novel architecture achieves up to an 81% reduction in latency on the TOP dataset and retains performance competitive with non-pretrained models on three different semantic parsing datasets.
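
To make the core idea concrete, the sketch below shows how non-autoregressive decoding differs from left-to-right generation: a convolutional encoder reads the utterance, a length head predicts the size of the target parse, and all output tokens are then predicted in parallel. This is a minimal illustrative sketch in PyTorch, not the authors' PyText implementation; all module names, layer choices, and hyperparameters here are assumptions.

import torch
import torch.nn as nn

class NonAutoregressiveParser(nn.Module):
    # Hypothetical module for illustration only; the paper's actual model lives in PyText.
    def __init__(self, src_vocab, tgt_vocab, d_model=256, max_tgt_len=64):
        super().__init__()
        self.embed = nn.Embedding(src_vocab, d_model)
        # Lightweight convolutional encoder standing in for the paper's CNN encoder.
        self.encoder = nn.Sequential(
            nn.Conv1d(d_model, d_model, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv1d(d_model, d_model, kernel_size=3, padding=1),
            nn.ReLU(),
        )
        # Classify the length of the output parse from the pooled encoding.
        self.length_head = nn.Linear(d_model, max_tgt_len)
        # One learned query vector per output slot; all slots are decoded at once.
        self.slot_embed = nn.Embedding(max_tgt_len, d_model)
        self.attn = nn.MultiheadAttention(d_model, num_heads=4, batch_first=True)
        self.out = nn.Linear(d_model, tgt_vocab)

    def forward(self, src_tokens):
        # src_tokens: (batch, src_len) integer ids of the utterance.
        x = self.embed(src_tokens)                               # (B, S, D)
        enc = self.encoder(x.transpose(1, 2)).transpose(1, 2)    # (B, S, D)
        # Predict the target parse length from mean-pooled encoder states.
        length_logits = self.length_head(enc.mean(dim=1))        # (B, max_tgt_len)
        pred_len = length_logits.argmax(dim=-1).clamp(min=1)     # (B,)
        # Decode every output position in parallel: slot queries attend to the
        # encoder states; there is no left-to-right loop over generated tokens.
        slots = self.slot_embed.weight.unsqueeze(0).expand(src_tokens.size(0), -1, -1)
        dec, _ = self.attn(slots, enc, enc)                      # (B, max_tgt_len, D)
        token_logits = self.out(dec)                             # (B, max_tgt_len, tgt_vocab)
        return token_logits, pred_len

# Toy usage: a batch of two utterances, each padded/truncated to 10 tokens.
model = NonAutoregressiveParser(src_vocab=1000, tgt_vocab=200)
logits, lengths = model(torch.randint(0, 1000, (2, 10)))
print(logits.shape, lengths)  # one softmax per output slot plus a predicted length per example

Because every output position is scored in a single forward pass, inference cost no longer grows with the number of sequential decoding steps, which is the source of the latency gains the abstract describes.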
Anthology ID: 2021.naacl-main.236
Volume: Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Month: June
Year: 2021
Address: Online
Venue: NAACL
Publisher: Association for Computational Linguistics
Pages: 2969–2978
URL: https://aclanthology.org/2021.naacl-main.236
DOI: 10.18653/v1/2021.naacl-main.236
Cite (ACL): Arun Babu, Akshat Shrivastava, Armen Aghajanyan, Ahmed Aly, Angela Fan, and Marjan Ghazvininejad. 2021. Non-Autoregressive Semantic Parsing for Compositional Task-Oriented Dialog. In Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 2969–2978, Online. Association for Computational Linguistics.
Cite (Informal): Non-Autoregressive Semantic Parsing for Compositional Task-Oriented Dialog (Babu et al., NAACL 2021)
PDF: https://preview.aclanthology.org/ingestion-script-update/2021.naacl-main.236.pdf
Video: https://preview.aclanthology.org/ingestion-script-update/2021.naacl-main.236.mp4
Code: facebookresearch/pytext
Data: SNIPS