Inducing Language-Agnostic Multilingual Representations

Wei Zhao, Steffen Eger, Johannes Bjerva, Isabelle Augenstein


Abstract
Cross-lingual representations have the potential to make NLP techniques available to the vast majority of languages in the world. However, they currently require large pretraining corpora or access to typologically similar languages. In this work, we address these obstacles by removing language identity signals from multilingual embeddings. We examine three approaches for this: (i) re-aligning the vector spaces of target languages (all together) to a pivot source language; (ii) removing language-specific means and variances, which yields better discriminativeness of embeddings as a by-product; and (iii) increasing input similarity across languages by removing morphological contractions and sentence reordering. We evaluate on XNLI and reference-free MT evaluation across 19 typologically diverse languages. Our findings expose the limitations of these approaches: unlike vector normalization, vector space re-alignment and text normalization do not achieve consistent gains across encoders and languages. However, because the approaches have additive effects, their combination decreases the cross-lingual transfer gap by 8.9 points (m-BERT) and 18.2 points (XLM-R) on average across all tasks and languages.
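Of the three approaches, the abstract singles out vector normalization (approach ii) as the one with consistent gains. A minimal sketch of that idea, removing per-language means and scaling to unit variance, might look as follows; the function name, the dictionary input format, and the epsilon constant are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

def normalize_per_language(emb_by_lang):
    """Remove language-specific means and variances (a sketch of
    approach (ii) from the abstract, not the authors' exact code).

    emb_by_lang: hypothetical dict mapping a language code to an
    (n_sentences, dim) array of sentence embeddings for that language.
    Returns a dict of the same shape with each language's embeddings
    centered to zero mean and scaled to unit variance per dimension.
    """
    normalized = {}
    for lang, X in emb_by_lang.items():
        mu = X.mean(axis=0, keepdims=True)        # language-specific mean
        sigma = X.std(axis=0, keepdims=True) + 1e-8  # avoid division by zero
        normalized[lang] = (X - mu) / sigma
    return normalized
```

After this step, embeddings from different languages share the same first- and second-order statistics per dimension, which removes one easy language identity signal a downstream classifier could latch onto.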
Anthology ID:
2021.starsem-1.22
Volume:
Proceedings of *SEM 2021: The Tenth Joint Conference on Lexical and Computational Semantics
Month:
August
Year:
2021
Address:
Online
Venue:
*SEM
SIG:
SIGSEM
Publisher:
Association for Computational Linguistics
Pages:
229–240
URL:
https://aclanthology.org/2021.starsem-1.22
DOI:
10.18653/v1/2021.starsem-1.22
Cite (ACL):
Wei Zhao, Steffen Eger, Johannes Bjerva, and Isabelle Augenstein. 2021. Inducing Language-Agnostic Multilingual Representations. In Proceedings of *SEM 2021: The Tenth Joint Conference on Lexical and Computational Semantics, pages 229–240, Online. Association for Computational Linguistics.
Cite (Informal):
Inducing Language-Agnostic Multilingual Representations (Zhao et al., *SEM 2021)
PDF:
https://preview.aclanthology.org/auto-file-uploads/2021.starsem-1.22.pdf
Optional supplementary material:
 2021.starsem-1.22.OptionalSupplementaryMaterial.pdf
Code
 AIPHES/Language-Agnostic-Contextualized-Encoders
Data
XNLI