DORIC : Domain Robust Fine-Tuning for Open Intent Clustering through Dependency Parsing

Jihyun Lee, Seungyeon Seo, Yunsu Kim, Gary Geunbae Lee


Abstract
We present our work on Track 2 of the Eleventh Dialog System Technology Challenge (DSTC11). DSTC11 Track 2 aims to provide a benchmark for zero-shot, cross-domain intent-set induction. In the absence of an in-domain training dataset, inducing users’ intentions requires utterance representations that are robust across domains. To achieve this, we fine-tuned the language model on a multi-domain dialogue dataset and proposed extracting Verb-Object pairs to remove the artifacts of unnecessary information. Furthermore, we devised a method that generates a name for each cluster to make the clustered results explainable. Our approach achieved 3rd place in the precision score and higher accuracy and normalized mutual information (NMI) scores than the baseline model on datasets from various domains.
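The Verb-Object extraction mentioned in the abstract can be illustrated with a minimal sketch. This is not the authors' code: a real pipeline would run a dependency parser such as spaCy or Stanza over each utterance, whereas here the parse for a single hypothetical utterance is hard-coded as (token, head index, dependency label) triples so the example stays self-contained.

```python
# Sketch (assumption, not the paper's implementation): pull (verb, object)
# pairs out of a dependency parse by following direct-object ("dobj") arcs
# back to their governing verb.

def extract_verb_object_pairs(parse):
    """Return (verb, object) pairs from arcs labeled as direct objects.

    `parse` is a list of (token, head_index, dependency_label) triples,
    where head_index is the 0-based position of the token's head.
    """
    pairs = []
    for token, head, dep in parse:
        if dep == "dobj":            # direct-object arc
            verb = parse[head][0]    # the governing verb of this object
            pairs.append((verb, token))
    return pairs

# Hard-coded parse for "I want to book a flight to Prague"
# (a parser like spaCy would produce these arcs automatically).
utterance_parse = [
    ("I", 1, "nsubj"),
    ("want", 1, "ROOT"),
    ("to", 3, "aux"),
    ("book", 1, "xcomp"),
    ("a", 5, "det"),
    ("flight", 3, "dobj"),   # "flight" is the direct object of "book"
    ("to", 5, "prep"),
    ("Prague", 6, "pobj"),
]

print(extract_verb_object_pairs(utterance_parse))  # → [('book', 'flight')]
```

The intuition, as the abstract suggests, is that the pair ("book", "flight") carries the user's intent while determiners, fillers, and domain-specific modifiers are discarded.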
Anthology ID:
2023.dstc-1.6
Volume:
Proceedings of The Eleventh Dialog System Technology Challenge
Month:
September
Year:
2023
Address:
Prague, Czech Republic
Editors:
Yun-Nung Chen, Paul Crook, Michel Galley, Sarik Ghazarian, Chulaka Gunasekara, Raghav Gupta, Behnam Hedayatnia, Satwik Kottur, Seungwhan Moon, Chen Zhang
Venues:
DSTC | WS
Publisher:
Association for Computational Linguistics
Pages:
40–47
URL:
https://aclanthology.org/2023.dstc-1.6
Cite (ACL):
Jihyun Lee, Seungyeon Seo, Yunsu Kim, and Gary Geunbae Lee. 2023. DORIC : Domain Robust Fine-Tuning for Open Intent Clustering through Dependency Parsing. In Proceedings of The Eleventh Dialog System Technology Challenge, pages 40–47, Prague, Czech Republic. Association for Computational Linguistics.
Cite (Informal):
DORIC : Domain Robust Fine-Tuning for Open Intent Clustering through Dependency Parsing (Lee et al., DSTC-WS 2023)
PDF:
https://preview.aclanthology.org/nschneid-patch-5/2023.dstc-1.6.pdf