Semi-Supervised Domain Adaptation for Emotion-Related Tasks

Mahshid Hosseini, Cornelia Caragea


Abstract
Semi-supervised domain adaptation (SSDA) adapts a model trained on a label-rich source domain to a new but related target domain in which only a few labeled examples are available. It has been shown that, in an SSDA setting, a simple combination of domain adaptation (DA) with semi-supervised learning (SSL) techniques often fails to effectively utilize the target supervision and cannot address distribution shifts across domains, because the training data is biased toward the source-labeled samples. In this paper, inspired by the co-learning of multiple classifiers for computer vision tasks, we propose to decompose the SSDA framework for emotion-related tasks into two subcomponents: unsupervised domain adaptation (UDA) from the source domain to the target domain, and semi-supervised learning (SSL) in the target domain, where the two models iteratively teach each other by exchanging their high-confidence predictions. We further propose a novel data cartography-based regularization technique for pseudo-label denoising that employs training dynamics to further improve our models' performance. We publicly release our code.
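
The abstract describes two ideas: a co-training scheme in which a UDA branch and an SSL branch exchange high-confidence pseudo-labels on unlabeled target data, and a data cartography-style filter that uses training dynamics to denoise those pseudo-labels. The sketch below is not the paper's released code; it is a minimal, self-contained illustration under simplified assumptions (numpy stand-in linear models, a fixed confidence threshold, and mean-confidence/variability statistics in the spirit of data cartography). All names and thresholds here are illustrative, not taken from the paper.

```python
# Illustrative sketch of the co-training loop described in the abstract.
# Assumptions (not from the paper): numpy stand-ins for the UDA and SSL models,
# a fixed confidence cutoff, and simple mean-confidence / variability statistics
# over epochs as a cartography-style pseudo-label filter.

import numpy as np

rng = np.random.default_rng(0)

NUM_CLASSES = 4       # e.g., a small emotion label set (illustrative)
CONF_THRESHOLD = 0.9  # hypothetical confidence cutoff for exchanging pseudo-labels
NUM_EPOCHS = 5

def predict_proba(weights, features):
    """Softmax predictions of a linear stand-in model."""
    logits = features @ weights
    logits -= logits.max(axis=1, keepdims=True)
    exp = np.exp(logits)
    return exp / exp.sum(axis=1, keepdims=True)

# Toy unlabeled target-domain features.
target_x = rng.normal(size=(200, 16))

# Two stand-in models: one adapted from the source domain (UDA branch) and one
# trained on the few labeled target examples (SSL branch).
weights = {
    "uda": rng.normal(size=(16, NUM_CLASSES)),
    "ssl": rng.normal(size=(16, NUM_CLASSES)),
}

# Per-example confidence over epochs, used for cartography-style statistics.
conf_history = {"uda": [], "ssl": []}

for epoch in range(NUM_EPOCHS):
    for name in ("uda", "ssl"):
        probs = predict_proba(weights[name], target_x)
        conf_history[name].append(probs.max(axis=1))

    # Each branch selects its high-confidence predictions on unlabeled target
    # data and hands them to the other branch as pseudo-labels.
    for teacher, student in (("uda", "ssl"), ("ssl", "uda")):
        probs = predict_proba(weights[teacher], target_x)
        confidence = probs.max(axis=1)
        pseudo_labels = probs.argmax(axis=1)

        # Cartography-style denoising: keep examples whose confidence is high
        # on average and stable (low variability) across the epochs seen so far.
        history = np.stack(conf_history[teacher])   # shape: (epochs, examples)
        mean_conf = history.mean(axis=0)
        variability = history.std(axis=0)
        keep = (confidence >= CONF_THRESHOLD) & (mean_conf >= 0.5) & (variability <= 0.2)

        if not keep.any():
            continue

        # Update the student on the selected pseudo-labeled examples with a
        # single cross-entropy gradient step (purely illustrative).
        x, y = target_x[keep], pseudo_labels[keep]
        student_probs = predict_proba(weights[student], x)
        onehot = np.eye(NUM_CLASSES)[y]
        grad = x.T @ (student_probs - onehot) / len(y)
        weights[student] -= 0.1 * grad

    print(f"epoch {epoch}: exchanged pseudo-labels between UDA and SSL branches")
```

The key design choice mirrored here is that each branch only teaches the other with predictions that are both confident now and stable over training, which is how training dynamics are used to suppress noisy pseudo-labels.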
Anthology ID:
2023.findings-acl.333
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
5402–5410
URL:
https://aclanthology.org/2023.findings-acl.333
DOI:
10.18653/v1/2023.findings-acl.333
Cite (ACL):
Mahshid Hosseini and Cornelia Caragea. 2023. Semi-Supervised Domain Adaptation for Emotion-Related Tasks. In Findings of the Association for Computational Linguistics: ACL 2023, pages 5402–5410, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Semi-Supervised Domain Adaptation for Emotion-Related Tasks (Hosseini & Caragea, Findings 2023)
PDF:
https://preview.aclanthology.org/dois-2013-emnlp/2023.findings-acl.333.pdf