Multiple Pivot Languages and Strategic Decoder Initialization Helps Neural Machine Translation

Shivam Mhaskar, Pushpak Bhattacharyya


Abstract
In machine translation, a pivot language can be used to assist the source-to-target translation model. In pivot-based transfer learning, the source-to-pivot and pivot-to-target models are used to improve the performance of the source-to-target model. This technique works best when both the source-pivot and pivot-target language pairs are high-resource and the source-target pair is low-resource. In some cases, however, such as for Indic languages, the pivot-to-target language pair is not high-resource. To overcome this limitation, we use multiple related languages as pivot languages to assist the source-to-target model. We show that using multiple pivot languages gives a 2.03 BLEU and 3.05 chrF score improvement over the baseline model, and that strategic decoder initialization while performing pivot-based transfer learning with multiple pivot languages gives a 3.67 BLEU and 5.94 chrF score improvement over the baseline model.
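Pivot-based transfer learning, as described in the abstract, amounts to initializing the low-resource source-to-target model from the two helper models: the encoder from the source-to-pivot model and the decoder from the pivot-to-target model. The sketch below illustrates this in PyTorch with a toy encoder-decoder model; it is not the paper's code, and all class and variable names are illustrative. With multiple pivots, one pivot-to-target decoder is available per pivot language, and choosing which of them to transfer is the "strategic decoder initialization" question the paper studies.

import torch.nn as nn

class Seq2Seq(nn.Module):
    """Toy stand-in for an encoder-decoder NMT model."""
    def __init__(self, vocab_size=1000, dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.encoder = nn.GRU(dim, dim, batch_first=True)
        self.decoder = nn.GRU(dim, dim, batch_first=True)
        self.out = nn.Linear(dim, vocab_size)

# Helper models, assumed already trained on their parallel corpora.
src_pivot = Seq2Seq()  # source -> pivot: supplies a source-side encoder
pivot_tgt = Seq2Seq()  # pivot -> target: supplies a target-side decoder

# Build the low-resource source -> target model by transferring the
# encoder from the source->pivot model and the decoder (plus output
# projection) from the pivot->target model; the resulting model is
# then fine-tuned on the small source-target parallel corpus.
src_tgt = Seq2Seq()
src_tgt.encoder.load_state_dict(src_pivot.encoder.state_dict())
src_tgt.decoder.load_state_dict(pivot_tgt.decoder.state_dict())
src_tgt.out.load_state_dict(pivot_tgt.out.state_dict())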
Anthology ID: 2022.loresmt-1.2
Volume: Proceedings of the Fifth Workshop on Technologies for Machine Translation of Low-Resource Languages (LoResMT 2022)
Month: October
Year: 2022
Address: Gyeongju, Republic of Korea
Venue: LoResMT
Publisher: Association for Computational Linguistics
Pages: 9–14
URL: https://aclanthology.org/2022.loresmt-1.2
Cite (ACL): Shivam Mhaskar and Pushpak Bhattacharyya. 2022. Multiple Pivot Languages and Strategic Decoder Initialization Helps Neural Machine Translation. In Proceedings of the Fifth Workshop on Technologies for Machine Translation of Low-Resource Languages (LoResMT 2022), pages 9–14, Gyeongju, Republic of Korea. Association for Computational Linguistics.
Cite (Informal): Multiple Pivot Languages and Strategic Decoder Initialization Helps Neural Machine Translation (Mhaskar & Bhattacharyya, LoResMT 2022)
PDF: https://preview.aclanthology.org/ingestion-script-update/2022.loresmt-1.2.pdf