Abstract
Pre-training masked language models (MLMs) with artificial data has proven beneficial for several natural language processing tasks, such as natural language understanding and summarization; however, it has been less explored for neural machine translation (NMT). A previous study revealed the benefit of transfer learning for NMT in a limited setup that differs from MLM. In this study, we prepared two kinds of artificial data and compared the translation performance of NMT models pre-trained on them with MLM. In addition to random sequences, we created artificial data that mimics token frequency information from the real world. Our results showed that MLM pre-training on artificial data improves translation performance in low-resource settings. Additionally, we found that pre-training on artificial data created with token frequency information in mind leads to further improvement.
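To make the setup concrete, the sketch below illustrates the two kinds of artificial pre-training data the abstract contrasts: purely random token sequences, and sequences whose token frequencies mimic real-world statistics (assumed here to be roughly Zipfian, as natural-language token frequencies typically are), together with standard MLM masking. The vocabulary size, Zipf exponent, and masking rate are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
VOCAB_SIZE = 8000     # hypothetical vocabulary size
SEQ_LEN = 32          # hypothetical sequence length
MASK_ID = VOCAB_SIZE  # reserve one extra id for the [MASK] token

def random_sequence():
    """Artificial data, variant 1: tokens drawn uniformly at random."""
    return rng.integers(0, VOCAB_SIZE, size=SEQ_LEN)

def zipfian_sequence(alpha=1.2):
    """Artificial data, variant 2: tokens drawn from a Zipf-like
    distribution, so a few ids are frequent and most are rare,
    mimicking real-world token frequency (assumed functional form)."""
    ranks = np.arange(1, VOCAB_SIZE + 1)
    probs = ranks ** -alpha
    probs /= probs.sum()
    return rng.choice(VOCAB_SIZE, size=SEQ_LEN, p=probs)

def mask_for_mlm(tokens, mask_prob=0.15):
    """Standard MLM corruption: hide ~15% of tokens; the model is
    trained to predict the originals (labels) from the corrupted input."""
    tokens = tokens.copy()
    labels = np.full_like(tokens, -100)  # -100 = position ignored by the loss
    mask = rng.random(tokens.shape) < mask_prob
    labels[mask] = tokens[mask]
    tokens[mask] = MASK_ID
    return tokens, labels

inputs, labels = mask_for_mlm(zipfian_sequence())
```

The only difference between the two variants is the sampling distribution over token ids; the MLM objective and model are held fixed, which is what lets such an experiment isolate the effect of frequency information in the artificial data.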
- Anthology ID: 2023.findings-eacl.166
- Volume: Findings of the Association for Computational Linguistics: EACL 2023
- Month: May
- Year: 2023
- Address: Dubrovnik, Croatia
- Editors: Andreas Vlachos, Isabelle Augenstein
- Venue: Findings
- Publisher: Association for Computational Linguistics
- Pages: 2216–2225
- URL: https://aclanthology.org/2023.findings-eacl.166
- DOI: 10.18653/v1/2023.findings-eacl.166
- Cite (ACL): Hiroto Tamura, Tosho Hirasawa, Hwichan Kim, and Mamoru Komachi. 2023. Does Masked Language Model Pre-training with Artificial Data Improve Low-resource Neural Machine Translation?. In Findings of the Association for Computational Linguistics: EACL 2023, pages 2216–2225, Dubrovnik, Croatia. Association for Computational Linguistics.
- Cite (Informal): Does Masked Language Model Pre-training with Artificial Data Improve Low-resource Neural Machine Translation? (Tamura et al., Findings 2023)
- PDF: https://preview.aclanthology.org/corrections-2024-07/2023.findings-eacl.166.pdf