Improving Zero-Shot Translation by Disentangling Positional Information
Danni Liu, Jan Niehues, James Cross, Francisco Guzmán, Xian Li
Abstract
Multilingual neural machine translation has shown the capability of directly translating between language pairs unseen in training, i.e. zero-shot translation. Despite being conceptually attractive, it often suffers from low output quality. The difficulty of generalizing to new translation directions suggests the model representations are highly specific to those language pairs seen in training. We demonstrate that a main factor causing the language-specific representations is the positional correspondence to input tokens. We show that this can be easily alleviated by removing residual connections in an encoder layer. With this modification, we gain up to 18.5 BLEU points on zero-shot translation while retaining quality on supervised directions. The improvements are particularly prominent between related languages, where our proposed model outperforms pivot-based translation. Moreover, our approach allows easy integration of new languages, which substantially expands translation coverage. By thorough inspections of the hidden layer outputs, we show that our approach indeed leads to more language-independent representations.

- Anthology ID:
- 2021.acl-long.101
- Volume:
- Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)
- Month:
- August
- Year:
- 2021
- Address:
- Online
- Editors:
- Chengqing Zong, Fei Xia, Wenjie Li, Roberto Navigli
- Venues:
- ACL | IJCNLP
- Publisher:
- Association for Computational Linguistics
- Pages:
- 1259–1273
- URL:
- https://aclanthology.org/2021.acl-long.101
- DOI:
- 10.18653/v1/2021.acl-long.101
- Cite (ACL):
- Danni Liu, Jan Niehues, James Cross, Francisco Guzmán, and Xian Li. 2021. Improving Zero-Shot Translation by Disentangling Positional Information. In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), pages 1259–1273, Online. Association for Computational Linguistics.
- Cite (Informal):
- Improving Zero-Shot Translation by Disentangling Positional Information (Liu et al., ACL-IJCNLP 2021)
- PDF:
- https://preview.aclanthology.org/fix-dup-bibkey/2021.acl-long.101.pdf
- Code:
- nlp-dke/NMTGMinor
- Data:
- PMIndia
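The central modification described in the abstract is removing the residual connection around one encoder layer, so that layer's outputs are no longer tied position-by-position to the input tokens. The following is a minimal NumPy sketch of that idea, not the authors' NMTGMinor implementation; all function and parameter names here are hypothetical, and layer norm, multi-head attention, and the feed-forward sublayer are omitted for brevity.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x, Wq, Wk, Wv):
    """Single-head scaled dot-product self-attention over a sequence x of shape (n, d)."""
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    scores = softmax(q @ k.T / np.sqrt(k.shape[-1]))
    return scores @ v

def encoder_layer(x, Wq, Wk, Wv, residual=True):
    """Simplified Transformer encoder sublayer.

    With residual=True (standard), each output position is additively tied
    to its input token, preserving positional correspondence. With
    residual=False (the paper's modification, applied to one encoder
    layer), the output is a pure mixture over positions, which weakens
    that correspondence.
    """
    attn_out = self_attention(x, Wq, Wk, Wv)
    return x + attn_out if residual else attn_out
```

With the residual removed, the difference between the two variants' outputs is exactly the input sequence, which makes the positional tie of the standard layer explicit.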