Margin-aware Unsupervised Domain Adaptation for Cross-lingual Text Labeling

Dejiao Zhang, Ramesh Nallapati, Henghui Zhu, Feng Nan, Cicero Nogueira dos Santos, Kathleen McKeown, Bing Xiang


Abstract
Unsupervised domain adaptation addresses the problem of leveraging labeled data in a source domain to learn a well-performing model in a target domain where labels are unavailable. In this paper, we build on recent theoretical work (Zhang et al., 2019b) and adopt the Margin Disparity Discrepancy (MDD) unsupervised domain adaptation algorithm for cross-lingual text labeling problems. Experiments on cross-lingual document classification and NER demonstrate that the proposed domain adaptation approach advances the state of the art by a large margin. Specifically, we improve MDD by efficiently optimizing the margin loss on the source domain via Virtual Adversarial Training (VAT). This bridges the gap between the theory and the loss function used in the original work of Zhang et al. (2019b), and thereby significantly boosts performance. Our numerical results also indicate that VAT can markedly improve generalization on both domains across various domain adaptation approaches.
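For readers unfamiliar with VAT (Miyato et al., 2018), the sketch below shows the standard smoothness penalty it adds: a perturbation of the input is found that maximally changes the model's predictive distribution, and the resulting divergence is penalized. This is a minimal, generic illustration, not the authors' implementation; the function name `vat_loss`, the hyperparameters `xi`, `eps`, and `n_power`, and the assumption of a PyTorch classifier returning logits are all ours. For text models, the perturbation is typically applied to the embedding layer rather than to raw tokens.

```python
import torch
import torch.nn.functional as F

def vat_loss(model, x, xi=1e-6, eps=1.0, n_power=1):
    """Generic Virtual Adversarial Training penalty (a sketch).

    model: maps inputs (or embeddings) to class logits.
    xi:    step size used while estimating the adversarial direction.
    eps:   norm of the final adversarial perturbation.
    """
    # Reference predictive distribution on the clean input (no gradient).
    with torch.no_grad():
        pred = F.softmax(model(x), dim=-1)

    # Start from a random unit-norm direction.
    d = torch.randn_like(x)
    d = F.normalize(d.flatten(1), dim=1).view_as(x)

    # Power iteration: push d toward the direction that most
    # increases the KL divergence from the clean prediction.
    for _ in range(n_power):
        d.requires_grad_()
        pred_hat = model(x + xi * d)
        adv_kl = F.kl_div(F.log_softmax(pred_hat, dim=-1),
                          pred, reduction="batchmean")
        grad = torch.autograd.grad(adv_kl, d)[0]
        d = F.normalize(grad.flatten(1), dim=1).view_as(x).detach()

    # Smoothness penalty at the adversarial perturbation.
    pred_hat = model(x + eps * d)
    return F.kl_div(F.log_softmax(pred_hat, dim=-1),
                    pred, reduction="batchmean")
```

In a domain adaptation setting like the one described above, a term of this form could be added to the training objective on source (and optionally target) batches; how it is weighted and combined with the MDD margin loss is specified in the paper itself.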
Anthology ID:
2020.findings-emnlp.315
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2020
Month:
November
Year:
2020
Address:
Online
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
3527–3536
URL:
https://aclanthology.org/2020.findings-emnlp.315
DOI:
10.18653/v1/2020.findings-emnlp.315
Bibkey:
Cite (ACL):
Dejiao Zhang, Ramesh Nallapati, Henghui Zhu, Feng Nan, Cicero Nogueira dos Santos, Kathleen McKeown, and Bing Xiang. 2020. Margin-aware Unsupervised Domain Adaptation for Cross-lingual Text Labeling. In Findings of the Association for Computational Linguistics: EMNLP 2020, pages 3527–3536, Online. Association for Computational Linguistics.
Cite (Informal):
Margin-aware Unsupervised Domain Adaptation for Cross-lingual Text Labeling (Zhang et al., Findings 2020)
PDF:
https://preview.aclanthology.org/auto-file-uploads/2020.findings-emnlp.315.pdf
Data
MLDoc