Regression-Free Model Updates for Spoken Language Understanding

Andrea Caciolai, Verena Weber, Tobias Falke, Alessandro Pedrani, Davide Bernardi


Abstract
In real-world systems, an important requirement for model updates is to avoid regressions in user experience caused by flips of previously correct classifications to incorrect ones. Multiple techniques to address this have been proposed in the recent literature. In this paper, we apply one such technique, focal distillation, to model updates in a goal-oriented dialog system and assess its usefulness in practice. In particular, we evaluate its effectiveness on key language understanding tasks, including sentence classification and sequence labeling, assess its behavior under repeated model updates over time, and test its robustness to mislabeled data. Our experiments on a public benchmark and on data from a deployed dialog system demonstrate that focal distillation can substantially reduce regressions at only minor drops in accuracy, and that it outperforms naive supervised training in challenging mislabeled-data and label-expansion settings.
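For readers unfamiliar with the technique, the sketch below illustrates a focal distillation objective in PyTorch. It is a minimal sketch assuming the commonly used positive-congruent formulation, in which a knowledge distillation term is up-weighted on samples the old model classified correctly; the function name, variable names, and default values are illustrative assumptions, not the authors' code.

    # Minimal focal distillation sketch (assumed formulation, illustrative names).
    import torch.nn.functional as F

    def focal_distillation_loss(new_logits, old_logits, labels,
                                alpha=1.0, beta=5.0, temperature=2.0):
        # Standard supervised cross-entropy on the new model's predictions.
        ce = F.cross_entropy(new_logits, labels)

        # Per-sample KL divergence between temperature-softened distributions
        # of the new and old model (the usual knowledge distillation term).
        log_p_new = F.log_softmax(new_logits / temperature, dim=-1)
        p_old = F.softmax(old_logits / temperature, dim=-1)
        kd = F.kl_div(log_p_new, p_old, reduction="none").sum(dim=-1)
        kd = kd * temperature ** 2  # usual T^2 factor to rescale gradients

        # Focal weighting: every sample gets weight alpha, and samples the
        # old model got right get an extra beta, pulling the new model
        # toward reproducing exactly the predictions that must not regress.
        old_correct = (old_logits.argmax(dim=-1) == labels).float()
        weight = alpha + beta * old_correct

        return ce + (weight * kd).mean()

Setting beta to zero recovers plain distillation; larger values trade a small amount of headline accuracy for fewer correct-to-incorrect flips, consistent with the accuracy/regression trade-off described in the abstract.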
Anthology ID: 2023.acl-industry.52
Volume: Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 5: Industry Track)
Month: July
Year: 2023
Address: Toronto, Canada
Editors: Sunayana Sitaram, Beata Beigman Klebanov, Jason D. Williams
Venue: ACL
Publisher: Association for Computational Linguistics
Pages: 538–551
URL: https://aclanthology.org/2023.acl-industry.52
DOI: 10.18653/v1/2023.acl-industry.52
Cite (ACL): Andrea Caciolai, Verena Weber, Tobias Falke, Alessandro Pedrani, and Davide Bernardi. 2023. Regression-Free Model Updates for Spoken Language Understanding. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 5: Industry Track), pages 538–551, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal): Regression-Free Model Updates for Spoken Language Understanding (Caciolai et al., ACL 2023)
PDF: https://aclanthology.org/2023.acl-industry.52.pdf