Abstract
We present a novel multi-task modeling approach to learning multilingual distributed representations of text. Our system learns word and sentence embeddings jointly by training a multilingual skip-gram model together with a cross-lingual sentence similarity model. Our architecture can transparently use both monolingual and sentence-aligned bilingual corpora to learn multilingual embeddings, thus covering a vocabulary significantly larger than that of the bilingual corpora alone. Our model shows competitive performance on a standard cross-lingual document classification task. We also show the effectiveness of our method in a limited-resource scenario.
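To make the joint objective concrete, below is a minimal PyTorch sketch of the two-task setup the abstract describes: a skip-gram loss (with negative sampling) and a cross-lingual sentence similarity loss sharing one embedding table, so monolingual text trains the first task while sentence-aligned bilingual pairs train the second. The shared table, mean-pooled sentence vectors, squared-distance similarity loss, and all names, sizes, and toy batches are illustrative assumptions, not the authors' released code; the paper defines the exact objectives.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

VOCAB_SIZE, DIM, NEG = 1000, 64, 5  # toy sizes, assumptions only

emb_in = nn.Embedding(VOCAB_SIZE, DIM)   # shared word embeddings (both tasks)
emb_out = nn.Embedding(VOCAB_SIZE, DIM)  # skip-gram output ("context") embeddings
opt = torch.optim.Adam(list(emb_in.parameters()) + list(emb_out.parameters()), lr=1e-3)

def skipgram_loss(centers, contexts):
    """Skip-gram with negative sampling over (center, context) word-id pairs."""
    c = emb_in(centers)                                   # (B, D)
    pos = emb_out(contexts)                               # (B, D)
    neg = emb_out(torch.randint(VOCAB_SIZE, (centers.size(0), NEG)))  # (B, NEG, D)
    pos_score = F.logsigmoid((c * pos).sum(-1))                        # (B,)
    neg_score = F.logsigmoid(-(neg @ c.unsqueeze(-1)).squeeze(-1)).sum(-1)  # (B,)
    return -(pos_score + neg_score).mean()

def sentence_vec(word_ids):
    """Sentence embedding as the mean of its word embeddings (a simple
    composition assumed here for the sketch)."""
    return emb_in(word_ids).mean(dim=0)

def similarity_loss(src_sents, tgt_sents):
    """Pull aligned sentence pairs together; a squared-distance variant."""
    src = torch.stack([sentence_vec(s) for s in src_sents])
    tgt = torch.stack([sentence_vec(t) for t in tgt_sents])
    return ((src - tgt) ** 2).sum(-1).mean()

# One joint step: a skip-gram word batch plus a batch of aligned sentence pairs
# (random toy data standing in for real monolingual/bilingual corpora).
centers = torch.randint(VOCAB_SIZE, (32,))
contexts = torch.randint(VOCAB_SIZE, (32,))
src = [torch.randint(VOCAB_SIZE, (8,)) for _ in range(4)]
tgt = [torch.randint(VOCAB_SIZE, (8,)) for _ in range(4)]

loss = skipgram_loss(centers, contexts) + similarity_loss(src, tgt)
opt.zero_grad()
loss.backward()
opt.step()
print(f"joint loss: {loss.item():.4f}")
```

Because both losses backpropagate into the same `emb_in` table, words seen only in monolingual data still land in the space shaped by the bilingual similarity task, which is how the vocabulary can exceed that of the bilingual corpora alone.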
- Anthology ID: P18-2035
- Volume: Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)
- Month: July
- Year: 2018
- Address: Melbourne, Australia
- Editors: Iryna Gurevych, Yusuke Miyao
- Venue: ACL
- Publisher: Association for Computational Linguistics
- Pages: 214–220
- URL: https://aclanthology.org/P18-2035
- DOI: 10.18653/v1/P18-2035
- Cite (ACL): Karan Singla, Dogan Can, and Shrikanth Narayanan. 2018. A Multi-task Approach to Learning Multilingual Representations. In Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 214–220, Melbourne, Australia. Association for Computational Linguistics.
- Cite (Informal): A Multi-task Approach to Learning Multilingual Representations (Singla et al., ACL 2018)
- PDF: https://preview.aclanthology.org/nschneid-patch-3/P18-2035.pdf