Abstract
We propose a mixture-of-experts approach for unsupervised domain adaptation from multiple sources. The key idea is to explicitly capture the relationship between a target example and different source domains. This relationship, expressed by a point-to-set metric, determines how to combine predictors trained on various domains. The metric is learned in an unsupervised fashion using meta-training. Experimental results on sentiment analysis and part-of-speech tagging demonstrate that our approach consistently outperforms multiple baselines and can robustly handle negative transfer.
- Anthology ID: D18-1498
- Volume: Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
- Month: October-November
- Year: 2018
- Address: Brussels, Belgium
- Editors: Ellen Riloff, David Chiang, Julia Hockenmaier, Jun’ichi Tsujii
- Venue: EMNLP
- SIG: SIGDAT
- Publisher: Association for Computational Linguistics
- Pages: 4694–4703
- URL: https://aclanthology.org/D18-1498
- DOI: 10.18653/v1/D18-1498
- Cite (ACL): Jiang Guo, Darsh Shah, and Regina Barzilay. 2018. Multi-Source Domain Adaptation with Mixture of Experts. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pages 4694–4703, Brussels, Belgium. Association for Computational Linguistics.
- Cite (Informal): Multi-Source Domain Adaptation with Mixture of Experts (Guo et al., EMNLP 2018)
- PDF: https://preview.aclanthology.org/ml4al-ingestion/D18-1498.pdf
- Code: jiangfeng1124/transfer
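
To illustrate the idea summarized in the abstract, the sketch below shows one way a point-to-set metric could weight per-domain expert predictions for a target example. This is a minimal, hypothetical example, not the authors' implementation (see the jiangfeng1124/transfer repository for that): the distance function, softmax gating, and `temperature` parameter are illustrative assumptions, whereas in the paper the metric is learned via meta-training rather than fixed.

```python
import numpy as np

def point_to_set_distance(x, domain_examples):
    """Illustrative point-to-set metric: mean Euclidean distance from a
    target example x to the encoded examples of one source domain.
    (Assumption; the paper learns this metric with meta-training.)"""
    return np.mean(np.linalg.norm(domain_examples - x, axis=1))

def mixture_of_experts_predict(x, source_domains, experts, temperature=1.0):
    """Combine per-domain expert predictions for x, weighting each expert
    by a softmax over negative point-to-set distances, so that closer
    source domains contribute more and distant (potentially negative-
    transfer) domains contribute less."""
    dists = np.array([point_to_set_distance(x, d) for d in source_domains])
    weights = np.exp(-dists / temperature)
    weights /= weights.sum()
    preds = np.stack([expert(x) for expert in experts])  # (num_experts, num_classes)
    return weights @ preds  # expectation of predictions under the gating weights
```

Here each element of `experts` would be a predictor trained on one source domain, and `source_domains` holds the corresponding encoded training examples; down-weighting distant domains is what lets the mixture suppress negative transfer.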