Enhancing the generalization for Intent Classification and Out-of-Domain Detection in SLU

Yilin Shen, Yen-Chang Hsu, Avik Ray, Hongxia Jin


Abstract
Intent classification is a major task in spoken language understanding (SLU). Since most models are built with pre-collected in-domain (IND) training utterances, their ability to detect unsupported out-of-domain (OOD) utterances is critical in practical use. Recent works have shown that using extra data and labels can improve OOD detection performance, yet it can be costly to collect such data. This paper proposes to train a model with only IND data while supporting both IND intent classification and OOD detection. Our method designs a novel domain-regularized module (DRM) to reduce the overconfidence phenomenon of a vanilla classifier, achieving better generalization on both tasks. Besides, DRM can be used as a drop-in replacement for the last layer in any neural network-based intent classifier, providing a low-cost strategy for a significant improvement. The evaluation on four datasets shows that our method built on BERT and RoBERTa models achieves state-of-the-art performance against existing approaches and the strong baselines we created for the comparisons.
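To make the "drop-in last layer" idea concrete, here is a minimal, hypothetical sketch in NumPy. It does not implement the paper's DRM; it only shows the generic setup the abstract assumes: a linear classification head over an encoder's utterance embedding, combined with a standard maximum-softmax-probability (MSP) confidence check, the common baseline whose overconfidence the DRM is designed to mitigate. All names (`LinearHead`, `classify_with_ood`, the threshold value) are illustrative, not from the paper.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the last axis.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

class LinearHead:
    """Vanilla last layer: maps an utterance embedding to class logits.
    A DRM-style module would replace exactly this component."""
    def __init__(self, W, b):
        self.W, self.b = W, b

    def __call__(self, h):
        return h @ self.W + self.b

def classify_with_ood(head, h, threshold=0.7):
    """Predict an intent, or flag OOD (label -1) when the maximum
    softmax probability falls below a confidence threshold."""
    probs = softmax(head(h))
    conf = probs.max(axis=-1)
    pred = probs.argmax(axis=-1)
    labels = np.where(conf >= threshold, pred, -1)
    return labels, conf

# Toy usage: 2 utterance embeddings of dim 8, 3 intent classes.
rng = np.random.default_rng(0)
head = LinearHead(rng.normal(size=(8, 3)), np.zeros(3))
h = rng.normal(size=(2, 8))
labels, conf = classify_with_ood(head, h)
```

Because the head is an isolated module with the same input/output interface as a linear layer, swapping in a regularized variant requires no change to the encoder or the surrounding pipeline, which is what makes the drop-in strategy low-cost.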
Anthology ID:
2021.acl-long.190
Volume:
Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)
Month:
August
Year:
2021
Address:
Online
Venues:
ACL | IJCNLP
Publisher:
Association for Computational Linguistics
Pages:
2443–2453
URL:
https://aclanthology.org/2021.acl-long.190
DOI:
10.18653/v1/2021.acl-long.190
Cite (ACL):
Yilin Shen, Yen-Chang Hsu, Avik Ray, and Hongxia Jin. 2021. Enhancing the generalization for Intent Classification and Out-of-Domain Detection in SLU. In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), pages 2443–2453, Online. Association for Computational Linguistics.
Cite (Informal):
Enhancing the generalization for Intent Classification and Out-of-Domain Detection in SLU (Shen et al., ACL-IJCNLP 2021)
PDF:
https://preview.aclanthology.org/paclic-22-ingestion/2021.acl-long.190.pdf
Video:
https://preview.aclanthology.org/paclic-22-ingestion/2021.acl-long.190.mp4
Data
ATIS