Online Semantic Parsing for Latency Reduction in Task-Oriented Dialogue

Jiawei Zhou, Jason Eisner, Michael Newman, Emmanouil Antonios Platanios, Sam Thomson


Abstract
Standard conversational semantic parsing maps a complete user utterance into an executable program, after which the program is executed to respond to the user. This can be slow when the program contains expensive function calls. We investigate the opportunity to reduce latency by predicting and executing function calls while the user is still speaking. We introduce the task of online semantic parsing for this purpose, with a formal latency reduction metric inspired by simultaneous machine translation. We propose a general framework consisting of first a learned prefix-to-program prediction module, and then a simple yet effective thresholding heuristic that selects subprograms for early execution. Experiments on the SMCalFlow and TreeDST datasets show that our approach achieves good parsing quality with a 30%–65% latency reduction, depending on function execution time and allowed cost.
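The thresholding heuristic for subprogram selection described above might be sketched as follows. This is an illustrative assumption, not the paper's implementation: the subprogram strings, confidence values, and the function name are hypothetical.

```python
# Hedged sketch (not the authors' code): given subprograms predicted from an
# utterance *prefix*, execute early only those whose model confidence clears
# a threshold; uncertain subprograms wait for more of the utterance.

def select_for_early_execution(scored_subprograms, threshold=0.9):
    """Return the subprograms whose confidence meets the threshold.

    scored_subprograms: list of (subprogram, confidence) pairs produced by
    a prefix-to-program prediction model.
    """
    return [sp for sp, conf in scored_subprograms if conf >= threshold]

# Example: predictions from a partial utterance ("What's on my calendar to...")
predicted = [
    ("find_events(date=tomorrow)", 0.95),  # stable prediction -> execute now
    ("create_event(title=?)", 0.40),       # uncertain -> defer execution
]
print(select_for_early_execution(predicted))
```

Executing the high-confidence call while the user is still speaking is what yields the latency reduction: by the time the full utterance arrives, its result is (ideally) already available.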
Anthology ID:
2022.acl-long.110
Volume:
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
1554–1576
URL:
https://aclanthology.org/2022.acl-long.110
DOI:
10.18653/v1/2022.acl-long.110
Award:
Outstanding Paper
Cite (ACL):
Jiawei Zhou, Jason Eisner, Michael Newman, Emmanouil Antonios Platanios, and Sam Thomson. 2022. Online Semantic Parsing for Latency Reduction in Task-Oriented Dialogue. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 1554–1576, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Online Semantic Parsing for Latency Reduction in Task-Oriented Dialogue (Zhou et al., ACL 2022)
PDF:
https://preview.aclanthology.org/improve-issue-templates/2022.acl-long.110.pdf
Video:
https://preview.aclanthology.org/improve-issue-templates/2022.acl-long.110.mp4