Towards Language Agnostic Universal Representations

Armen Aghajanyan, Xia Song, Saurabh Tiwary


Abstract
When a bilingual student learns to solve word problems in math, we expect the student to be able to solve these problems in both languages the student is fluent in, even if the math lessons were taught in only one language. However, current representations in machine learning are language dependent. In this work, we present a method to decouple the language from the problem by learning language agnostic representations, thereby allowing a model trained in one language to be applied to a different one in a zero-shot fashion. We learn these representations by taking inspiration from linguistics, specifically the Universal Grammar hypothesis, and learning universal latent representations that are language agnostic. We demonstrate the capabilities of these representations by showing that models trained on a single language using language agnostic representations achieve very similar accuracies in other languages.
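The abstract's central claim is zero-shot cross-lingual transfer: a classifier trained on language agnostic representations in one language should work, unchanged, in another. Below is a minimal Python sketch of that evaluation protocol only. The `encode` function is a hypothetical stand-in for the paper's learned language agnostic encoder (a toy hashed bag-of-words is used here just so the script runs end to end), and the tiny sentiment examples are illustrative, not drawn from the paper's IMDb or SNLI experiments.

```python
# Minimal sketch of zero-shot cross-lingual evaluation: fit a classifier on
# embeddings of one language's training data, then score it on a second
# language it never saw during training.
import numpy as np
from sklearn.linear_model import LogisticRegression

DIM = 256

def encode(sentences):
    """Hypothetical language-agnostic encoder.

    This placeholder hashes tokens into a fixed-size bag-of-words vector so
    the example is runnable; it does NOT actually align languages. Swap in a
    real multilingual/language-agnostic encoder to exercise the idea.
    """
    vecs = np.zeros((len(sentences), DIM))
    for i, sent in enumerate(sentences):
        for tok in sent.lower().split():
            vecs[i, hash(tok) % DIM] += 1.0
    return vecs

# Training data in English (IMDb-style binary sentiment, made up here).
train_texts = ["a wonderful film", "a terrible waste of time"]
train_labels = [1, 0]

# Evaluation data in French: zero-shot, no French training examples used.
test_texts = ["un film merveilleux", "une terrible perte de temps"]
test_labels = [1, 0]

clf = LogisticRegression().fit(encode(train_texts), train_labels)
print("zero-shot accuracy:", clf.score(encode(test_texts), test_labels))
```

The design point the sketch isolates: if the encoder truly maps both languages into a shared space, nothing downstream of `encode` needs to know which language the input was in.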
Anthology ID: P19-1395
Volume: Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
Month: July
Year: 2019
Address: Florence, Italy
Venue: ACL
Publisher: Association for Computational Linguistics
Pages: 4033–4041
URL: https://aclanthology.org/P19-1395
DOI: 10.18653/v1/P19-1395
Cite (ACL): Armen Aghajanyan, Xia Song, and Saurabh Tiwary. 2019. Towards Language Agnostic Universal Representations. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pages 4033–4041, Florence, Italy. Association for Computational Linguistics.
Cite (Informal): Towards Language Agnostic Universal Representations (Aghajanyan et al., ACL 2019)
PDF: https://preview.aclanthology.org/update-css-js/P19-1395.pdf
Data: IMDb Movie Reviews, SNLI