@inproceedings{cui-bollegala-2019-self,
    title = "Self-Adaptation for Unsupervised Domain Adaptation",
    author = "Cui, Xia  and
      Bollegala, Danushka",
    editor = "Mitkov, Ruslan  and
      Angelova, Galia",
    booktitle = "Proceedings of the International Conference on Recent Advances in Natural Language Processing (RANLP 2019)",
    month = sep,
    year = "2019",
    address = "Varna, Bulgaria",
    publisher = "INCOMA Ltd.",
    url = "https://preview.aclanthology.org/ingest-emnlp/R19-1025/",
    doi = "10.26615/978-954-452-056-4_025",
    pages = "213--222",
    abstract = "Lack of labelled data in the target domain for training is a common problem in domain adaptation. To overcome this problem, we propose a novel unsupervised domain adaptation method that combines projection and self-training based approaches. Using the labelled data from the source domain, we first learn a projection that maximises the distance among the nearest neighbours with opposite labels in the source domain. Next, we project the source domain labelled data using the learnt projection and train a classifier for the target class prediction. We then use the trained classifier to predict pseudo labels for the target domain unlabelled data. Finally, we learn a projection for the target domain as we did for the source domain using the pseudo-labelled target domain data, where we maximise the distance between nearest neighbours having opposite pseudo labels. Experiments on a standard benchmark dataset for domain adaptation show that the proposed method consistently outperforms numerous baselines and returns competitive results comparable to that of SOTA including self-training, tri-training, and neural adaptations."
}
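The abstract above describes a two-stage pipeline: learn a source projection that pushes apart nearest neighbours with opposite labels, train a classifier in that space, pseudo-label the target data, then repeat the projection step on the pseudo-labelled target domain. The sketch below is a minimal, hypothetical Python rendering of that pipeline, assuming an eigen-decomposition over opposite-label nearest-neighbour difference vectors as the projection and a logistic-regression classifier; all names and implementation choices are illustrative assumptions, not the authors' released code.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors


def learn_projection(X, y, dim=50):
    # Simplified reading of the objective: spread apart each point and its
    # nearest neighbour that carries the opposite label.
    nn = NearestNeighbors(n_neighbors=1)
    scatter = np.zeros((X.shape[1], X.shape[1]))
    for label in np.unique(y):
        same, other = X[y == label], X[y != label]
        nn.fit(other)
        _, idx = nn.kneighbors(same)
        diffs = same - other[idx[:, 0]]      # opposite-label neighbour pairs
        scatter += diffs.T @ diffs           # accumulate pairwise scatter
    # Directions of largest scatter maximise the summed squared distances
    # between opposite-label neighbours; keep the top `dim` of them.
    vals, vecs = np.linalg.eigh(scatter)
    return vecs[:, np.argsort(vals)[::-1][:dim]]


def self_adapt(Xs, ys, Xt, dim=50):
    # Source projection -> classifier -> pseudo labels -> target projection.
    Ws = learn_projection(Xs, ys, dim)
    clf = LogisticRegression(max_iter=1000).fit(Xs @ Ws, ys)
    pseudo = clf.predict(Xt @ Ws)            # pseudo-label the unlabelled target data
    Wt = learn_projection(Xt, pseudo, dim)   # repeat the projection step on the target
    return LogisticRegression(max_iter=1000).fit(Xt @ Wt, pseudo), Wt

In this reading, the final classifier is trained entirely in the target-domain projected space using pseudo labels; how the paper combines the source and target projections and selects `dim` is not specified here and would need to be taken from the paper itself.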