Subword Mapping and Anchoring across Languages

Giorgos Vernikos, Andrei Popescu-Belis


Abstract
State-of-the-art multilingual systems rely on shared vocabularies that sufficiently cover all considered languages. To this end, a simple and frequently used approach makes use of subword vocabularies constructed jointly over several languages. We hypothesize that such vocabularies are suboptimal due to false positives (identical subwords with different meanings across languages) and false negatives (different subwords with similar meanings). To address these issues, we propose Subword Mapping and Anchoring across Languages (SMALA), a method to construct bilingual subword vocabularies. SMALA extracts subword alignments using a state-of-the-art unsupervised mapping technique and uses them to create cross-lingual anchors based on subword similarities. We demonstrate the benefits of SMALA for cross-lingual natural language inference (XNLI), where it improves zero-shot transfer to an unseen language without task-specific data, but only by sharing subword embeddings. Moreover, in neural machine translation, we show that joint subword vocabularies obtained with SMALA lead to higher BLEU scores on sentences that contain many false positives and false negatives.
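
To illustrate the anchoring idea described in the abstract, here is a minimal sketch (not the authors' released code; see georgevern/smala for that). It assumes subword embeddings for the two languages have already been mapped into a shared space by an unsupervised mapping method; subword pairs that are mutual nearest neighbors with cosine similarity above a threshold are then kept as cross-lingual anchors. All names, the threshold value, and the toy data below are illustrative assumptions.

import numpy as np

def find_anchors(src_emb, tgt_emb, src_vocab, tgt_vocab, threshold=0.5):
    """Return (src_subword, tgt_subword) pairs that are mutual nearest
    neighbors by cosine similarity and exceed `threshold`."""
    # Normalize rows so that dot products equal cosine similarities.
    s = src_emb / np.linalg.norm(src_emb, axis=1, keepdims=True)
    t = tgt_emb / np.linalg.norm(tgt_emb, axis=1, keepdims=True)
    sim = s @ t.T                   # (|V_src|, |V_tgt|) similarity matrix
    fwd = sim.argmax(axis=1)        # best target subword for each source subword
    bwd = sim.argmax(axis=0)        # best source subword for each target subword
    anchors = []
    for i, j in enumerate(fwd):
        # Keep pairs that choose each other and are similar enough.
        if bwd[j] == i and sim[i, j] >= threshold:
            anchors.append((src_vocab[i], tgt_vocab[j]))
    return anchors

# Toy usage: random vectors stand in for mapped subword embeddings;
# the target embeddings are near-copies, so all pairs become anchors.
rng = np.random.default_rng(0)
src = rng.normal(size=(5, 8))
tgt = src + 0.01 * rng.normal(size=(5, 8))
print(find_anchors(src, tgt,
                   ["_un", "der", "stand", "ing", "s"],
                   ["_ver", "steh", "en", "d", "e"]))

In the paper's setting, the resulting anchor pairs are the subwords whose embeddings are tied across languages, while unaligned subwords keep language-specific embeddings.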
Anthology ID:
2021.findings-emnlp.224
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2021
Month:
November
Year:
2021
Address:
Punta Cana, Dominican Republic
Venue:
Findings
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
2633–2647
URL:
https://aclanthology.org/2021.findings-emnlp.224
DOI:
10.18653/v1/2021.findings-emnlp.224
Cite (ACL):
Giorgos Vernikos and Andrei Popescu-Belis. 2021. Subword Mapping and Anchoring across Languages. In Findings of the Association for Computational Linguistics: EMNLP 2021, pages 2633–2647, Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Subword Mapping and Anchoring across Languages (Vernikos & Popescu-Belis, Findings 2021)
PDF:
https://preview.aclanthology.org/ingestion-script-update/2021.findings-emnlp.224.pdf
Video:
https://preview.aclanthology.org/ingestion-script-update/2021.findings-emnlp.224.mp4
Code:
georgevern/smala
Data:
XNLI