Abstract
We present a method for completing multilingual translation dictionaries. Our probabilistic approach can synthesize new word forms, allowing it to operate in settings where correct translations have not been observed in text (cf. cross-lingual embeddings). In addition, we propose an approximate Maximum Mutual Information (MMI) decoding objective to further improve performance in both many-to-one and one-to-one word-level translation tasks, where we use either multiple input languages for a single target language or more typical single-language-pair translation. The model is trained in a many-to-many setting, where it can leverage information from related languages to predict words in each of its many target languages. We focus on 6 languages: French, Spanish, Italian, Portuguese, Romanian, and Turkish. When indirect multilingual information is available, ensembling with mixture-of-experts as well as incorporating related languages leads to a 27% relative improvement in whole-word accuracy of predictions over a single-source baseline. To seed the completion when multilingual data is unavailable, it is better to decode with an MMI objective.
- Anthology ID: 2020.coling-main.387
- Volume: Proceedings of the 28th International Conference on Computational Linguistics
- Month: December
- Year: 2020
- Address: Barcelona, Spain (Online)
- Editors: Donia Scott, Nuria Bel, Chengqing Zong
- Venue: COLING
- Publisher: International Committee on Computational Linguistics
- Pages: 4373–4384
- URL: https://aclanthology.org/2020.coling-main.387
- DOI: 10.18653/v1/2020.coling-main.387
- Cite (ACL): Dylan Lewis, Winston Wu, Arya D. McCarthy, and David Yarowsky. 2020. Neural Transduction for Multilingual Lexical Translation. In Proceedings of the 28th International Conference on Computational Linguistics, pages 4373–4384, Barcelona, Spain (Online). International Committee on Computational Linguistics.
- Cite (Informal): Neural Transduction for Multilingual Lexical Translation (Lewis et al., COLING 2020)
- PDF: https://preview.aclanthology.org/nschneid-patch-4/2020.coling-main.387.pdf
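The approximate MMI objective mentioned in the abstract can be sketched as n-best reranking: instead of picking the hypothesis maximizing the conditional probability p(y|x) alone, score each candidate by log p(y|x) − λ·log p(y), so that generic, high-prior target words are penalized. This is a minimal illustrative sketch, not the paper's implementation; the function name, the λ value, and the toy probabilities are all hypothetical.

```python
import math

def mmi_rerank(hypotheses, lam=0.5):
    """Rerank n-best hypotheses with an approximate MMI score:
        score(y) = log p(y | x) - lam * log p(y)
    Subtracting a weighted target-side prior penalizes frequent,
    generic outputs that many different sources would map to.

    hypotheses: list of (word, log_p_y_given_x, log_p_y) tuples.
    Returns the list sorted best-first by MMI score.
    """
    def score(hyp):
        _, log_cond, log_prior = hyp
        return log_cond - lam * log_prior
    return sorted(hypotheses, key=score, reverse=True)

# Toy example with hypothetical numbers: the frequent word has the
# higher conditional probability, but its high prior lets MMI prefer
# the rarer, more specific translation.
nbest = [
    ("casa",  math.log(0.40), math.log(0.30)),  # generic, frequent
    ("hogar", math.log(0.35), math.log(0.05)),  # specific, rare
]
ranked = mmi_rerank(nbest, lam=0.5)
```

With λ = 0, the reranker reduces to ordinary conditional-likelihood decoding; larger λ trades conditional fit for specificity of the output.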