Syntax Matters: Towards Spoken Language Understanding via Syntax-Aware Attention

Yifeng Xie, Zhihong Zhu, Xuxin Cheng, Zhiqi Huang, Dongsheng Chen


Abstract
Spoken Language Understanding (SLU), a crucial component of task-oriented dialogue systems, has consistently garnered attention from both academic and industrial communities. Although incorporating syntactic information into models has the potential to enhance the comprehension of user utterances and yield impressive results, its application in SLU systems remains largely unexplored. In this paper, we propose a carefully designed model termed Syntax-aware attention (SAT) to enhance SLU, where attention scopes are constrained based on relationships within the syntactic structure. Experimental results on three datasets show that our model achieves substantial improvements and excellent performance. Moreover, SAT can be integrated into other BERT-based language models to further boost their performance.
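For intuition, below is a minimal, hypothetical sketch of the general idea of syntax-aware attention: restricting each token's attention scope to tokens it is syntactically related to (here, its dependency head and children) via an attention mask. This is an illustrative reading of the abstract, not the paper's exact formulation; the token names, dependency heads, and scope definition are assumptions for the example.

```python
import numpy as np

def dependency_mask(heads):
    """Boolean attention mask from a dependency parse.

    heads[i] is the index of token i's head (the root points to itself).
    Each token may attend to itself, its head, and its children -- one
    illustrative way to constrain attention to the syntactic neighbourhood;
    the paper's actual scope definition may differ.
    """
    n = len(heads)
    mask = np.eye(n, dtype=bool)
    for i, h in enumerate(heads):
        mask[i, h] = True  # token -> its head
        mask[h, i] = True  # head  -> its child
    return mask

def syntax_aware_attention(Q, K, V, mask):
    """Scaled dot-product attention restricted to syntactically related tokens."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)
    scores = np.where(mask, scores, -1e9)  # block out-of-scope positions
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

# Toy utterance "book a flight to Boston" with hypothetical dependency heads.
heads = [0, 2, 0, 4, 0]
rng = np.random.default_rng(0)
n, d = len(heads), 8
Q, K, V = (rng.standard_normal((n, d)) for _ in range(3))
out = syntax_aware_attention(Q, K, V, dependency_mask(heads))
print(out.shape)  # (5, 8)
```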
Anthology ID: 2023.findings-emnlp.794
Volume: Findings of the Association for Computational Linguistics: EMNLP 2023
Month: December
Year: 2023
Address: Singapore
Editors: Houda Bouamor, Juan Pino, Kalika Bali
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 11858–11864
URL: https://aclanthology.org/2023.findings-emnlp.794
DOI: 10.18653/v1/2023.findings-emnlp.794
Cite (ACL): Yifeng Xie, Zhihong Zhu, Xuxin Cheng, Zhiqi Huang, and Dongsheng Chen. 2023. Syntax Matters: Towards Spoken Language Understanding via Syntax-Aware Attention. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 11858–11864, Singapore. Association for Computational Linguistics.
Cite (Informal): Syntax Matters: Towards Spoken Language Understanding via Syntax-Aware Attention (Xie et al., Findings 2023)
PDF: https://preview.aclanthology.org/nschneid-patch-4/2023.findings-emnlp.794.pdf