Single Model Ensemble for Subword Regularized Models in Low-Resource Machine Translation

Sho Takase, Tatsuya Hiraoka, Naoaki Okazaki


Abstract
Subword regularization uses multiple subword segmentations during training to improve the robustness of neural machine translation models. In previous work on subword regularization, multiple segmentations are used during training, but only a single segmentation is used at inference time. In this study, we propose an inference strategy to address this discrepancy. The proposed strategy approximates the marginalized likelihood by using multiple segmentations, including the most plausible segmentation and several sampled segmentations. Because the proposed strategy aggregates predictions from several segmentations, we can regard it as a single-model ensemble that does not require any additional training cost. Experimental results show that the proposed strategy improves the performance of models trained with subword regularization on low-resource machine translation tasks.
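To make the strategy concrete, below is a minimal sketch of the inference-time aggregation, assuming a trained SentencePiece unigram model. The file name "spm.model", the sampling hyperparameters, and the `model.next_token_probs` hook are hypothetical placeholders standing in for the authors' actual setup, not their released code.

```python
# Minimal sketch: single-model ensemble over multiple subword segmentations.
# Assumes a trained SentencePiece unigram model ("spm.model") and an NMT model
# exposing a per-step next-token probability hook; both names are hypothetical.
import numpy as np
import sentencepiece as spm

sp = spm.SentencePieceProcessor(model_file="spm.model")

def source_segmentations(sentence, n_samples=4, alpha=0.2):
    """One-best (Viterbi) segmentation plus several sampled segmentations."""
    candidates = [sp.encode(sentence, out_type=int)]  # most plausible segmentation
    for _ in range(n_samples):
        candidates.append(sp.encode(sentence, out_type=int,
                                    enable_sampling=True,
                                    alpha=alpha, nbest_size=-1))  # sample from the lattice
    return candidates

def ensemble_greedy_decode(model, sentence, max_len=128, bos_id=1, eos_id=2):
    """Greedy decoding that averages the model's next-token distributions
    across segmentations, approximating the marginalized likelihood."""
    sources = source_segmentations(sentence)
    target = [bos_id]
    for _ in range(max_len):
        # model.next_token_probs(src_ids, tgt_prefix) -> vocabulary-sized
        # probability vector; a hypothetical hook for one decoder forward pass.
        probs = np.mean([model.next_token_probs(src, target) for src in sources],
                        axis=0)
        next_id = int(np.argmax(probs))
        target.append(next_id)
        if next_id == eos_id:
            break
    return target
```

The averaging step plays the role of the ensemble: the same network is queried once per segmentation and the predictions are combined, so no additional training is required.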
Anthology ID:
2022.findings-acl.199
Volume:
Findings of the Association for Computational Linguistics: ACL 2022
Month:
May
Year:
2022
Address:
Dublin, Ireland
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
2536–2541
URL:
https://aclanthology.org/2022.findings-acl.199
DOI:
10.18653/v1/2022.findings-acl.199
Cite (ACL):
Sho Takase, Tatsuya Hiraoka, and Naoaki Okazaki. 2022. Single Model Ensemble for Subword Regularized Models in Low-Resource Machine Translation. In Findings of the Association for Computational Linguistics: ACL 2022, pages 2536–2541, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Single Model Ensemble for Subword Regularized Models in Low-Resource Machine Translation (Takase et al., Findings 2022)
PDF:
https://preview.aclanthology.org/ingestion-script-update/2022.findings-acl.199.pdf