“Wikily” Supervised Neural Translation Tailored to Cross-Lingual Tasks

Mohammad Sadegh Rasooli, Chris Callison-Burch, Derry Tanti Wijaya


Abstract
We present a simple but effective approach for leveraging Wikipedia for neural machine translation, as well as for the cross-lingual tasks of image captioning and dependency parsing, without using any direct supervision from external parallel data or supervised models in the target language. We show that the first sentences and titles of linked Wikipedia pages, as well as cross-lingual image captions, are strong signals for seed parallel data from which to extract bilingual dictionaries and cross-lingual word embeddings for mining parallel text from Wikipedia. Our final model achieves high BLEU scores that are close to, and sometimes higher than, those of strong supervised baselines in low-resource languages; e.g., a supervised BLEU of 4.0 versus 12.1 from our model in English-to-Kazakh. Moreover, we tailor our wikily translation models to unsupervised image captioning and cross-lingual dependency parser transfer. In image captioning, we train a multi-task machine translation and image captioning pipeline for Arabic and English in which the Arabic training data is a wikily translation of the English captioning data. Our captioning results for Arabic are slightly better than those of its supervised counterpart. In dependency parsing, we translate a large amount of monolingual text and use it as artificial training data in an annotation projection framework. We show that our model outperforms recent work on cross-lingual transfer of dependency parsers.
Anthology ID:
2021.emnlp-main.124
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
1655–1670
URL:
https://aclanthology.org/2021.emnlp-main.124
DOI:
10.18653/v1/2021.emnlp-main.124
Cite (ACL):
Mohammad Sadegh Rasooli, Chris Callison-Burch, and Derry Tanti Wijaya. 2021. “Wikily” Supervised Neural Translation Tailored to Cross-Lingual Tasks. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 1655–1670, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
“Wikily” Supervised Neural Translation Tailored to Cross-Lingual Tasks (Rasooli et al., EMNLP 2021)
PDF:
https://preview.aclanthology.org/ingestion-script-update/2021.emnlp-main.124.pdf
Video:
https://preview.aclanthology.org/ingestion-script-update/2021.emnlp-main.124.mp4
Data
Conceptual Captions