Cost-Sensitive Active Learning for Dialogue State Tracking

Kaige Xie, Cheng Chang, Liliang Ren, Lu Chen, Kai Yu


Abstract
Dialogue state tracking (DST), when formulated as a supervised learning problem, relies on labelled data. Since dialogue state annotation usually requires labelling all turns of a single dialogue and utilizing context information, it is very expensive to annotate all available unlabelled data. In this paper, a novel cost-sensitive active learning framework is proposed based on a set of new dialogue-level query strategies. This is the first attempt to apply active learning for dialogue state tracking. Experiments on DSTC2 show that active learning with mixed data query strategies can effectively achieve the same DST performance with significantly less data annotation compared to traditional training approaches.
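The paper's specific dialogue-level query strategies are not reproduced here; as a minimal illustrative sketch, dialogue-level active learning with a turn-count annotation budget could look like the following, where uncertainty is measured as mean per-turn entropy of the tracker's belief distributions. All function and variable names (`turn_entropy`, `dialogue_uncertainty`, `select_dialogues`, `pool`, `budget`) are hypothetical, not from the paper.

```python
import math

def turn_entropy(belief):
    """Shannon entropy of one turn's belief distribution over slot values.

    Hypothetical uncertainty measure; the paper's own strategies may differ.
    """
    return -sum(p * math.log(p) for p in belief if p > 0)

def dialogue_uncertainty(dialogue_beliefs):
    """Dialogue-level score: mean entropy across the dialogue's turns."""
    return sum(turn_entropy(b) for b in dialogue_beliefs) / len(dialogue_beliefs)

def select_dialogues(pool, budget):
    """Greedily pick the most uncertain dialogues to send for annotation.

    Annotation cost is modelled as the number of turns (since every turn of
    a selected dialogue must be labelled), making the selection
    cost-sensitive: a dialogue is skipped if its turns exceed the remaining
    budget.  `pool` maps dialogue id -> list of per-turn belief
    distributions produced by the current tracker.
    """
    ranked = sorted(pool.items(),
                    key=lambda kv: dialogue_uncertainty(kv[1]),
                    reverse=True)
    chosen, spent = [], 0
    for dialogue_id, beliefs in ranked:
        if spent + len(beliefs) <= budget:
            chosen.append(dialogue_id)
            spent += len(beliefs)
    return chosen

# Example: a two-dialogue pool where "d1" has uniform (uncertain) beliefs
# and "d2" has peaked (confident) ones; with a 2-turn budget only "d1" fits.
pool = {
    "d1": [[0.5, 0.5], [0.5, 0.5]],
    "d2": [[0.99, 0.01], [0.99, 0.01]],
}
print(select_dialogues(pool, budget=2))  # -> ['d1']
```

In a full loop, the selected dialogues would be labelled, added to the training set, the DST model retrained, and the pool rescored before the next query round.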
Anthology ID:
W18-5022
Volume:
Proceedings of the 19th Annual SIGdial Meeting on Discourse and Dialogue
Month:
July
Year:
2018
Address:
Melbourne, Australia
Editors:
Kazunori Komatani, Diane Litman, Kai Yu, Alex Papangelis, Lawrence Cavedon, Mikio Nakano
Venue:
SIGDIAL
SIG:
SIGDIAL
Publisher:
Association for Computational Linguistics
Pages:
209–213
URL:
https://aclanthology.org/W18-5022
DOI:
10.18653/v1/W18-5022
Cite (ACL):
Kaige Xie, Cheng Chang, Liliang Ren, Lu Chen, and Kai Yu. 2018. Cost-Sensitive Active Learning for Dialogue State Tracking. In Proceedings of the 19th Annual SIGdial Meeting on Discourse and Dialogue, pages 209–213, Melbourne, Australia. Association for Computational Linguistics.
Cite (Informal):
Cost-Sensitive Active Learning for Dialogue State Tracking (Xie et al., SIGDIAL 2018)
PDF:
https://preview.aclanthology.org/ml4al-ingestion/W18-5022.pdf