Yan Pan
2023
Data-Augmented Task-Oriented Dialogue Response Generation with Domain Adaptation
Yan Pan | Davide Cadamuro | Georg Groh
Proceedings of the 37th Pacific Asia Conference on Language, Information and Computation
2022
User Satisfaction Modeling with Domain Adaptation in Task-oriented Dialogue Systems
Yan Pan | Mingyang Ma | Bernhard Pflugfelder | Georg Groh
Proceedings of the 23rd Annual Meeting of the Special Interest Group on Discourse and Dialogue
User Satisfaction Estimation (USE) is crucial for measuring the quality of a task-oriented dialogue system. However, the complex nature of implicit responses makes user satisfaction hard to detect, and most datasets are limited in size or withheld from the public due to user privacy policies. Unlike task-oriented dialogue, large-scale chitchat annotated with emotion labels is publicly available. We therefore present a novel user satisfaction model with domain adaptation (USMDA) that exploits this chitchat. We adopt a dialogue Transformer encoder to capture contextual features from the dialogue, and we reduce domain discrepancy to learn dialogue-related domain-invariant features. Moreover, USMDA jointly learns satisfaction signals in the chitchat context via user satisfaction estimation, and user actions in task-oriented dialogue via dialogue action recognition. Experimental results on two benchmarks show that our proposed framework for the USE task outperforms existing unsupervised domain adaptation methods. To the best of our knowledge, this is the first work to study user satisfaction estimation with unsupervised domain adaptation from chitchat to task-oriented dialogue.
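The abstract describes two components: reducing domain discrepancy between chitchat (source) and task-oriented dialogue (target) features, and a joint training objective over satisfaction estimation and dialogue action recognition. The paper does not state the exact discrepancy measure or loss weights, so the sketch below is a hedged illustration using a linear-kernel Maximum Mean Discrepancy (a common choice for this kind of loss) and hypothetical weighting coefficients, not the authors' implementation.

```python
def mmd_linear(source_feats, target_feats):
    """Linear-kernel MMD: squared distance between the mean feature
    vectors of two domains (here plain lists of float vectors)."""
    dim = len(source_feats[0])
    mean_s = [sum(f[i] for f in source_feats) / len(source_feats) for i in range(dim)]
    mean_t = [sum(f[i] for f in target_feats) / len(target_feats) for i in range(dim)]
    return sum((a - b) ** 2 for a, b in zip(mean_s, mean_t))


def joint_loss(l_use, l_dar, l_mmd, alpha=1.0, beta=0.1):
    """Joint objective in the spirit of the abstract: satisfaction-estimation
    loss plus action-recognition loss plus a discrepancy penalty.
    alpha and beta are hypothetical weights, not values from the paper."""
    return l_use + alpha * l_dar + beta * l_mmd
```

If the pooled encoder features of both domains coincide, the discrepancy term vanishes and the objective reduces to the two task losses.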
2016
Modelling Sentence Pairs with Tree-structured Attentive Encoder
Yao Zhou | Cong Liu | Yan Pan
Proceedings of COLING 2016, the 26th International Conference on Computational Linguistics: Technical Papers
We describe an attentive encoder that combines tree-structured recursive neural networks and sequential recurrent neural networks for modelling sentence pairs. Since existing attentive models apply attention over sequential structure, we propose a way to incorporate attention into the tree topology. Specifically, given a pair of sentences, our attentive encoder uses the representation of one sentence, generated via an RNN, to guide the structural encoding of the other sentence over its dependency parse tree. We evaluate the proposed attentive encoder on three tasks: semantic similarity, paraphrase identification, and true-false question selection. Experimental results show that our encoder outperforms all baselines and achieves state-of-the-art results on two tasks.
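The core idea in the abstract is that one sentence's RNN summary guides the bottom-up composition of the other sentence along its dependency tree. The sketch below illustrates that control flow only: the composition function (element-wise sum) and the guide-conditioned gating (sigmoid of a dot product) are simplified placeholders, not the paper's actual recursive cells or attention formulation.

```python
import math


def dot(u, v):
    return sum(a * b for a, b in zip(u, v))


def encode_tree(node, guide):
    """node = (word_vec, children); guide = RNN summary of the other sentence.
    Children are composed first, merged into the head word's vector, and the
    result is gated by its affinity with the guide vector (illustrative only)."""
    word_vec, children = node
    composed = list(word_vec)
    for child in children:
        child_vec = encode_tree(child, guide)
        composed = [a + b for a, b in zip(composed, child_vec)]
    gate = 1.0 / (1.0 + math.exp(-dot(composed, guide)))  # guide-conditioned gate
    return [gate * x for x in composed]


# Tiny two-node dependency tree: a head word with one dependent.
leaf = ([1.0, 0.0], [])
root = ([0.0, 1.0], [leaf])
sentence_vec = encode_tree(root, guide=[1.0, 1.0])
```

Changing the guide vector changes every node's gate, which is the mechanism the abstract describes: the paired sentence steers the structural encoding of the tree.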
Co-authors
- Georg Groh 2
- Yao Zhou 1
- Cong Liu 1
- Davide Cadamuro 1
- Mingyang Ma 1