Improving Zero-shot Cross-lingual Dialogue State Tracking via Contrastive Learning
Xiang Yu | Zhang Ting | Di Hui | Huang Hui | Li Chunyou | Ouchi Kazushige | Chen Yufeng | Xu Jinan
Proceedings of the 22nd Chinese National Conference on Computational Linguistics (2023)
Abstract: Recent works in dialogue state tracking (DST) focus on a handful of languages, as collecting large-scale manually annotated data in different languages is expensive. Existing models address this issue by code-switched data augmentation or intermediate fine-tuning of multilingual pre-trained models. However, these models can only perform implicit alignment across languages. In this paper, we propose a novel model named Contrastive Learning for Cross-Lingual DST (CLCL-DST) to enhance zero-shot cross-lingual adaptation. Specifically, we use a self-built bilingual dictionary for lexical substitution to construct multilingual views of the same utterance. Then our approach leverages fine-grained contrastive learning to encourage representations of specific slot tokens in different views to be more similar than negative example pairs. By this means, CLCL-DST aligns similar words across languages into a more refined language-invariant space. In addition, CLCL-DST uses a significance-based keyword extraction approach to select task-related words to build the bilingual dictionary for better cross-lingual positive examples. Experiment results on Multilingual WoZ 2.0 and parallel MultiWoZ 2.1 datasets show that our proposed CLCL-DST outperforms existing state-of-the-art methods by a large margin, demonstrating the effectiveness of CLCL-DST.
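The two core steps the abstract describes can be illustrated with a short sketch. The snippet below is not the authors' released code: it assumes one-to-one lexical substitution (so the two views stay token-aligned), uses a toy bilingual dictionary, substitutes random tensors for a multilingual encoder's token representations, and picks hypothetical slot positions. It shows a token-level InfoNCE loss in the spirit of the paper's fine-grained contrastive objective.

import torch
import torch.nn.functional as F

# Hypothetical bilingual dictionary used for code-switching (en -> de);
# the paper builds its dictionary via significance-based keyword extraction.
BILINGUAL_DICT = {"cheap": "billig", "north": "norden"}

def code_switch(tokens):
    """Build a multilingual view of an utterance by lexical substitution."""
    return [BILINGUAL_DICT.get(t, t) for t in tokens]

def token_infonce(h_src, h_tgt, slot_idx, temperature=0.1):
    """Fine-grained contrastive loss over slot tokens.

    h_src, h_tgt: (seq_len, dim) token representations of the two views
                  (assumed token-aligned under one-to-one substitution).
    slot_idx:     positions of slot-value tokens; each aligned pair is a
                  positive, all other target tokens serve as negatives.
    """
    h_src = F.normalize(h_src, dim=-1)
    h_tgt = F.normalize(h_tgt, dim=-1)
    # Similarity of every source slot token to every target token.
    logits = h_src[slot_idx] @ h_tgt.T / temperature  # (n_slots, seq_len)
    labels = torch.tensor(slot_idx)  # positive = the aligned position
    return F.cross_entropy(logits, labels)

# Toy usage: random tensors stand in for a multilingual encoder's output.
tokens = ["i", "want", "a", "cheap", "restaurant", "in", "the", "north"]
switched = code_switch(tokens)  # ['i', 'want', 'a', 'billig', ..., 'norden']
h_en = torch.randn(len(tokens), 768, requires_grad=True)
h_cs = torch.randn(len(switched), 768)
loss = token_infonce(h_en, h_cs, slot_idx=[3, 7])  # 'cheap', 'north'
print(f"contrastive loss: {loss.item():.4f}")

Pulling aligned slot tokens together while pushing them away from all other tokens in the switched view is what drives the representations toward the language-invariant space the abstract refers to.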