Self-Adaptation for Unsupervised Domain Adaptation

Xia Cui, Danushka Bollegala


Abstract
A lack of labelled data in the target domain is a common problem in domain adaptation. To overcome this problem, we propose a novel unsupervised domain adaptation method that combines projection-based and self-training-based approaches. Using the labelled data from the source domain, we first learn a projection that maximises the distance between nearest neighbours with opposite labels in the source domain. Next, we project the source domain labelled data using the learnt projection and train a classifier to predict the target classes. We then use the trained classifier to predict pseudo labels for the unlabelled target domain data. Finally, we learn a projection for the target domain from the pseudo-labelled target domain data, as we did for the source domain, maximising the distance between nearest neighbours with opposite pseudo labels. Experiments on a standard benchmark dataset for domain adaptation show that the proposed method consistently outperforms numerous baselines and returns results competitive with state-of-the-art methods, including self-training, tri-training, and neural adaptation.
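The four steps described in the abstract can be sketched in a few lines of NumPy. This is a simplified illustration under stated assumptions, not the authors' implementation: the opposite-label nearest-neighbour projection is approximated here by the top eigenvector of the scatter of opposite-label nearest-neighbour difference vectors (which maximises the summed squared projected separation of those pairs), and a nearest-centroid classifier stands in for the paper's classifier. All data and helper names are hypothetical.

```python
import numpy as np

def opposite_label_nn_diffs(X, y):
    """For each point, the difference vector to its nearest neighbour of the opposite label."""
    diffs = []
    for i in range(len(X)):
        mask = y != y[i]
        dist = np.linalg.norm(X[mask] - X[i], axis=1)
        diffs.append(X[mask][np.argmin(dist)] - X[i])
    return np.array(diffs)

def learn_projection(X, y, k=1):
    """Direction(s) maximising squared projected distance between opposite-label NN pairs."""
    D = opposite_label_nn_diffs(X, y)
    S = D.T @ D                       # scatter of the difference vectors
    _, vecs = np.linalg.eigh(S)
    return vecs[:, -k:]               # top-k eigenvectors

def nearest_centroid_predict(Xtr, ytr, Xte):
    classes = np.unique(ytr)
    cents = np.array([Xtr[ytr == c].mean(axis=0) for c in classes])
    d = np.linalg.norm(Xte[:, None, :] - cents[None], axis=2)
    return classes[np.argmin(d, axis=1)]

# Toy source/target data: target is a shifted version of the source (hypothetical).
rng = np.random.default_rng(0)
Xs = np.vstack([rng.normal(0.0, 0.3, (20, 2)), rng.normal(2.0, 0.3, (20, 2))])
ys = np.array([0] * 20 + [1] * 20)
Xt = np.vstack([rng.normal(0.5, 0.3, (20, 2)), rng.normal(2.5, 0.3, (20, 2))])

W = learn_projection(Xs, ys)                            # step 1: source projection
pseudo = nearest_centroid_predict(Xs @ W, ys, Xt @ W)   # steps 2-3: classify, pseudo-label target
Wt = learn_projection(Xt, pseudo)                       # step 4: target projection from pseudo labels
```

The key point of the pipeline is that step 4 reuses the same projection-learning objective as step 1, but with pseudo labels in place of gold labels, so no target-domain annotation is needed.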
Anthology ID:
R19-1025
Volume:
Proceedings of the International Conference on Recent Advances in Natural Language Processing (RANLP 2019)
Month:
September
Year:
2019
Address:
Varna, Bulgaria
Editors:
Ruslan Mitkov, Galia Angelova
Venue:
RANLP
Publisher:
INCOMA Ltd.
Pages:
213–222
URL:
https://aclanthology.org/R19-1025
DOI:
10.26615/978-954-452-056-4_025
Cite (ACL):
Xia Cui and Danushka Bollegala. 2019. Self-Adaptation for Unsupervised Domain Adaptation. In Proceedings of the International Conference on Recent Advances in Natural Language Processing (RANLP 2019), pages 213–222, Varna, Bulgaria. INCOMA Ltd.
Cite (Informal):
Self-Adaptation for Unsupervised Domain Adaptation (Cui & Bollegala, RANLP 2019)
PDF:
https://preview.aclanthology.org/emnlp22-frontmatter/R19-1025.pdf