Multilingual Multimodal Learning with Machine Translated Text
Chen Qiu, Dan Oneață, Emanuele Bugliarello, Stella Frank, Desmond Elliott
Abstract
Most vision-and-language pretraining research focuses on English tasks. However, the creation of multilingual multimodal evaluation datasets (e.g. Multi30K, xGQA, XVNLI, and MaRVL) poses a new challenge in finding high-quality training data that is both multilingual and multimodal. In this paper, we investigate whether machine translating English multimodal data can be an effective proxy for the lack of readily available multilingual data. We call this framework TD-MML: Translated Data for Multilingual Multimodal Learning, and it can be applied to any multimodal dataset and model. We apply it to both pretraining and fine-tuning data with a state-of-the-art model. In order to prevent models from learning from low-quality translated text, we propose two metrics for automatically removing such translations from the resulting datasets. In experiments on five tasks across 20 languages in the IGLUE benchmark, we show that translated data can provide a useful signal for multilingual multimodal learning, both at pretraining and fine-tuning.
- Anthology ID: 2022.findings-emnlp.308
- Volume: Findings of the Association for Computational Linguistics: EMNLP 2022
- Month: December
- Year: 2022
- Address: Abu Dhabi, United Arab Emirates
- Editors: Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
- Venue: Findings
- Publisher: Association for Computational Linguistics
- Pages: 4178–4193
- URL: https://aclanthology.org/2022.findings-emnlp.308
- DOI: 10.18653/v1/2022.findings-emnlp.308
- Cite (ACL): Chen Qiu, Dan Oneață, Emanuele Bugliarello, Stella Frank, and Desmond Elliott. 2022. Multilingual Multimodal Learning with Machine Translated Text. In Findings of the Association for Computational Linguistics: EMNLP 2022, pages 4178–4193, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
- Cite (Informal): Multilingual Multimodal Learning with Machine Translated Text (Qiu et al., Findings 2022)
- PDF: https://preview.aclanthology.org/fix-dup-bibkey/2022.findings-emnlp.308.pdf