Attention-based Semantic Priming for Slot-filling

Jiewen Wu, Rafael E. Banchs, Luis Fernando D’Haro, Pavitra Krishnaswamy, Nancy Chen



Abstract
The problem of sequence labelling in language understanding would benefit from approaches inspired by semantic priming phenomena. We propose that an attention-based RNN architecture can be used to simulate semantic priming for sequence labelling. Specifically, we employ pre-trained word embeddings to characterize the semantic relationship between utterances and labels. We validate the approach using varying sizes of the ATIS and MEDIA datasets, and show up to 1.4-1.9% improvement in F1 score. The developed framework can enable more explainable and generalizable spoken language understanding systems.
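To make the idea concrete, below is a minimal sketch, under assumptions, of how attention over pre-trained label embeddings can "prime" per-token slot predictions in a BiLSTM tagger. The class and parameter names (LabelAttentionSlotFiller, emb_dim, hidden_dim, pretrained_label_embs) are hypothetical; this is an illustrative PyTorch sketch of the general technique, not the authors' released model.

```python
# Hedged sketch (assumptions, not the paper's exact architecture):
# a BiLSTM tagger that attends over pre-trained label embeddings so that
# semantically related labels "prime" each token's slot prediction.
import torch
import torch.nn as nn


class LabelAttentionSlotFiller(nn.Module):
    def __init__(self, vocab_size, num_labels, emb_dim=100, hidden_dim=128,
                 pretrained_label_embs=None):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, emb_dim)
        # Label embeddings initialized from pre-trained word vectors of the
        # label names (e.g. "city", "airline"); frozen so they act as a fixed
        # semantic prime rather than free parameters.
        self.label_emb = nn.Embedding(num_labels, emb_dim)
        if pretrained_label_embs is not None:
            self.label_emb.weight.data.copy_(pretrained_label_embs)
            self.label_emb.weight.requires_grad = False
        self.rnn = nn.LSTM(emb_dim, hidden_dim, batch_first=True,
                           bidirectional=True)
        self.proj = nn.Linear(2 * hidden_dim, emb_dim)
        self.out = nn.Linear(2 * hidden_dim + emb_dim, num_labels)

    def forward(self, token_ids):
        h, _ = self.rnn(self.word_emb(token_ids))       # (B, T, 2H)
        q = self.proj(h)                                 # (B, T, E)
        scores = q @ self.label_emb.weight.t()           # (B, T, L)
        attn = torch.softmax(scores, dim=-1)             # attention over labels
        prime = attn @ self.label_emb.weight             # (B, T, E) primed context
        return self.out(torch.cat([h, prime], dim=-1))   # per-token slot logits


# Toy usage: 2 utterances of length 5, vocabulary of 50 words, 4 slot labels.
model = LabelAttentionSlotFiller(vocab_size=50, num_labels=4)
logits = model(torch.randint(0, 50, (2, 5)))             # shape (2, 5, 4)
```

In this sketch the attention score is a plain dot product between a projected hidden state and each label vector; the attention mechanism, embeddings, and hyperparameters used in the paper may differ.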
Anthology ID: W18-2404
Volume: Proceedings of the Seventh Named Entities Workshop
Month: July
Year: 2018
Address: Melbourne, Australia
Editors: Nancy Chen, Rafael E. Banchs, Xiangyu Duan, Min Zhang, Haizhou Li
Venue: NEWS
Publisher: Association for Computational Linguistics
Pages: 22–26
URL: https://aclanthology.org/W18-2404
DOI: 10.18653/v1/W18-2404
Cite (ACL): Jiewen Wu, Rafael E. Banchs, Luis Fernando D’Haro, Pavitra Krishnaswamy, and Nancy Chen. 2018. Attention-based Semantic Priming for Slot-filling. In Proceedings of the Seventh Named Entities Workshop, pages 22–26, Melbourne, Australia. Association for Computational Linguistics.
Cite (Informal): Attention-based Semantic Priming for Slot-filling (Wu et al., NEWS 2018)
PDF: https://preview.aclanthology.org/teach-a-man-to-fish/W18-2404.pdf