The impact of domain-specific representations on BERT-based multi-domain spoken language understanding

Judith Gaspers, Quynh Do, Tobias Röding, Melanie Bradford


Abstract
This paper provides the first experimental study on the impact of using domain-specific representations on a BERT-based multi-task spoken language understanding (SLU) model for multi-domain applications. Our results on a real-world dataset covering three languages indicate that by using domain-specific representations learned adversarially, model performance can be improved across all three SLU subtasks: domain classification, intent classification, and slot filling. Gains are particularly large for domains with limited training data.
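
The abstract describes the approach only at a high level. As a rough illustration of what "adversarially learned representations on a BERT-based multi-task SLU model" can look like, the sketch below combines a shared BERT encoder with three task heads (domain, intent, slots) and a gradient-reversal domain discriminator. The class names, the gradient-reversal formulation, the choice of multilingual BERT, and the hyperparameter lambd are assumptions made for illustration; this is not the authors' implementation.

import torch
import torch.nn as nn
from transformers import AutoModel


class GradientReversal(torch.autograd.Function):
    """Identity in the forward pass; negates (and scales) gradients in the backward pass."""
    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.clone()

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lambd * grad_output, None


class MultiDomainSLU(nn.Module):
    """Illustrative multi-task SLU model: domain, intent and slot heads over one encoder."""
    def __init__(self, n_domains, n_intents, n_slot_labels, lambd=0.1,
                 model_name="bert-base-multilingual-cased"):
        super().__init__()
        self.bert = AutoModel.from_pretrained(model_name)
        hidden = self.bert.config.hidden_size
        self.lambd = lambd
        # The three SLU task heads named in the abstract.
        self.domain_head = nn.Linear(hidden, n_domains)
        self.intent_head = nn.Linear(hidden, n_intents)
        self.slot_head = nn.Linear(hidden, n_slot_labels)
        # Adversarial domain discriminator attached through gradient reversal
        # (one common way to realise adversarial domain learning; an assumption here).
        self.domain_discriminator = nn.Linear(hidden, n_domains)

    def forward(self, input_ids, attention_mask):
        out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        token_repr = out.last_hidden_state      # [B, T, H], used for slot filling
        utt_repr = token_repr[:, 0]             # [CLS] vector for utterance-level tasks
        domain_logits = self.domain_head(utt_repr)
        intent_logits = self.intent_head(utt_repr)
        slot_logits = self.slot_head(token_repr)
        # Gradient reversal encourages the encoder towards domain-invariant features,
        # while the dedicated heads capture domain-specific information.
        adv_logits = self.domain_discriminator(
            GradientReversal.apply(utt_repr, self.lambd))
        return domain_logits, intent_logits, slot_logits, adv_logits

In training, the cross-entropy losses of the three task heads would be summed with the adversarial discriminator loss; the weighting and scheduling of these losses are further design choices not specified in the abstract.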
Anthology ID:
2021.adaptnlp-1.4
Volume:
Proceedings of the Second Workshop on Domain Adaptation for NLP
Month:
April
Year:
2021
Address:
Kyiv, Ukraine
Venue:
AdaptNLP
Publisher:
Association for Computational Linguistics
Pages:
28–32
URL:
https://aclanthology.org/2021.adaptnlp-1.4
Cite (ACL):
Judith Gaspers, Quynh Do, Tobias Röding, and Melanie Bradford. 2021. The impact of domain-specific representations on BERT-based multi-domain spoken language understanding. In Proceedings of the Second Workshop on Domain Adaptation for NLP, pages 28–32, Kyiv, Ukraine. Association for Computational Linguistics.
Cite (Informal):
The impact of domain-specific representations on BERT-based multi-domain spoken language understanding (Gaspers et al., AdaptNLP 2021)
PDF:
https://preview.aclanthology.org/ingestion-script-update/2021.adaptnlp-1.4.pdf