Abstract
We introduce a neural network model that marries ideas from two prominent strands of research on domain adaptation through representation learning: structural correspondence learning (SCL, (Blitzer et al., 2006)) and autoencoder neural networks (NNs). Our model is a three-layer NN that learns to encode the non-pivot features of an input example into a low-dimensional representation, so that the existence of pivot features (features that are prominent in both domains and convey useful information for the NLP task) in the example can be decoded from that representation. The low-dimensional representation is then employed in a learning algorithm for the task. Moreover, we show how to inject pre-trained word embeddings into our model in order to improve generalization across examples with similar pivot features. We experiment with the task of cross-domain sentiment classification on 16 domain pairs and show substantial improvements over strong baselines.
- Anthology ID:
- K17-1040
- Volume:
- Proceedings of the 21st Conference on Computational Natural Language Learning (CoNLL 2017)
- Month:
- August
- Year:
- 2017
- Address:
- Vancouver, Canada
- Editors:
- Roger Levy, Lucia Specia
- Venue:
- CoNLL
- SIG:
- SIGNLL
- Publisher:
- Association for Computational Linguistics
- Pages:
- 400–410
- URL:
- https://aclanthology.org/K17-1040
- DOI:
- 10.18653/v1/K17-1040
- Cite (ACL):
- Yftah Ziser and Roi Reichart. 2017. Neural Structural Correspondence Learning for Domain Adaptation. In Proceedings of the 21st Conference on Computational Natural Language Learning (CoNLL 2017), pages 400–410, Vancouver, Canada. Association for Computational Linguistics.
- Cite (Informal):
- Neural Structural Correspondence Learning for Domain Adaptation (Ziser & Reichart, CoNLL 2017)
- PDF:
- https://preview.aclanthology.org/nschneid-patch-4/K17-1040.pdf
- Code:
- yftah89/Neural-SCLDomain-Adaptation + additional community code
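The core idea in the abstract — a three-layer network that encodes an example's non-pivot features into a low-dimensional representation from which the presence of its pivot features is decoded — can be sketched as a small autoencoder-style net. The following is a minimal illustrative sketch, not the authors' implementation: all dimensions, variable names, and the single plain-gradient training step are assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy sizes: real feature spaces are much larger.
n_nonpivot, n_pivot, n_hidden = 50, 10, 5

# Three-layer net: non-pivot input -> low-dim encoding -> pivot predictions.
W_enc = rng.normal(0.0, 0.1, (n_hidden, n_nonpivot))
W_dec = rng.normal(0.0, 0.1, (n_pivot, n_hidden))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def encode(x_nonpivot):
    """Low-dimensional representation later fed to the task classifier."""
    return sigmoid(W_enc @ x_nonpivot)

def decode(h):
    """Predicted probability that each pivot feature occurs in the example."""
    return sigmoid(W_dec @ h)

# Toy example: binary bag-of-features input and pivot indicator targets.
x = rng.integers(0, 2, n_nonpivot).astype(float)
y = rng.integers(0, 2, n_pivot).astype(float)

# One gradient step on binary cross-entropy over the pivot indicators.
lr = 0.1
h = encode(x)
p = decode(h)
loss = -np.mean(y * np.log(p) + (1.0 - y) * np.log(1.0 - p))

g_out = (p - y) / n_pivot                  # dL/d(decoder logits)
grad_W_dec = np.outer(g_out, h)
g_h = (W_dec.T @ g_out) * h * (1.0 - h)    # backprop through encoder sigmoid
grad_W_enc = np.outer(g_h, x)
W_dec -= lr * grad_W_dec
W_enc -= lr * grad_W_enc
```

Because pivot prediction needs no task labels, this objective can be trained on unlabeled data from both domains; the learned `encode(x)` is then used as (part of) the input representation for the supervised sentiment classifier.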