Supervised Domain Enablement Attention for Personalized Domain Classification

Joo-Kyung Kim, Young-Bum Kim


Abstract
In large-scale domain classification for natural language understanding, leveraging each user's domain enablement information, which refers to the domains preferred or authenticated by the user, with an attention mechanism has been shown to improve overall domain classification performance. In this paper, we propose a supervised enablement attention mechanism that uses sigmoid activation for the attention weighting, so that the attention can be computed with more expressive power, free of the weight-sum constraint of softmax attention. The attention weights are explicitly encouraged to be similar to the corresponding elements of the ground-truth one-hot output vector, and self-distillation is used to leverage the attention information of the other enabled domains. Evaluating on actual utterances from a large-scale intelligent personal digital assistant (IPDA), we show that our approach significantly improves domain classification performance.
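Since the abstract describes the mechanism only in prose, the following minimal NumPy sketch may help. It is not the authors' code: the function names, the binary cross-entropy form of the attention supervision, and the distillation temperature are illustrative assumptions; see the paper for the exact losses.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def enablement_attention(scores, domain_embeddings):
    """Sigmoid attention over a user's enabled domains.

    scores: (k,) unnormalized attention logits, one per enabled domain.
    domain_embeddings: (k, d) embeddings of the enabled domains.
    Unlike softmax attention, the weights need not sum to one.
    """
    weights = sigmoid(scores)              # each weight lies in (0, 1) independently
    summary = weights @ domain_embeddings  # (d,) weighted sum fed to the classifier
    return weights, summary

def attention_supervision_loss(weights, one_hot, eps=1e-8):
    """Binary cross-entropy pushing each attention weight toward the
    matching element of the one-hot ground-truth domain vector."""
    return -np.mean(one_hot * np.log(weights + eps)
                    + (1.0 - one_hot) * np.log(1.0 - weights + eps))

def self_distillation_targets(scores, temperature=2.0):
    """Soft targets from the model's own attention logits (temperature
    value is hypothetical); softened targets let the other enabled
    domains, not just the ground-truth one, contribute a signal."""
    return sigmoid(scores / temperature)

# Example: 4 enabled domains, 8-dim embeddings, domain 2 is ground truth.
rng = np.random.default_rng(0)
scores = rng.normal(size=4)
emb = rng.normal(size=(4, 8))
one_hot = np.array([0.0, 0.0, 1.0, 0.0])

weights, summary = enablement_attention(scores, emb)
loss = attention_supervision_loss(weights, one_hot)
soft = self_distillation_targets(scores)
```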
Anthology ID:
D18-1106
Volume:
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Month:
October-November
Year:
2018
Address:
Brussels, Belgium
Editors:
Ellen Riloff, David Chiang, Julia Hockenmaier, Jun’ichi Tsujii
Venue:
EMNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
894–899
URL:
https://aclanthology.org/D18-1106
DOI:
10.18653/v1/D18-1106
Cite (ACL):
Joo-Kyung Kim and Young-Bum Kim. 2018. Supervised Domain Enablement Attention for Personalized Domain Classification. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pages 894–899, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal):
Supervised Domain Enablement Attention for Personalized Domain Classification (Kim & Kim, EMNLP 2018)
PDF:
https://preview.aclanthology.org/nschneid-patch-2/D18-1106.pdf
Video:
https://preview.aclanthology.org/nschneid-patch-2/D18-1106.mp4