Rethinking Translation Memory Augmented Neural Machine Translation
Hongkun Hao, Guoping Huang, Lemao Liu, Zhirui Zhang, Shuming Shi, Rui Wang
Abstract
This paper rethinks translation memory augmented neural machine translation (TM-augmented NMT) from two perspectives, i.e., a probabilistic view of retrieval and the variance-bias decomposition principle. The analysis shows that TM-augmented NMT is good at fitting the data (i.e., lower bias) but is more sensitive to fluctuations in the training data (i.e., higher variance), which explains a recently reported contradictory phenomenon on the same translation task: TM-augmented NMT substantially advances NMT without TM under the high-resource scenario, whereas it fails under the low-resource scenario. This paper then proposes a simple yet effective TM-augmented NMT model to reduce the variance and resolve the contradictory phenomenon. Extensive experiments show that the proposed TM-augmented NMT achieves consistent gains over both conventional NMT and existing TM-augmented NMT under two variance-preferable scenarios (low resource and plug-and-play) as well as the high-resource scenario.
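For context, the variance-bias decomposition invoked above is, in its standard squared-loss form, a split of expected error into a bias term and a variance term. A minimal sketch follows; the notation ($f_{\mathcal{D}}$ for a model trained on dataset $\mathcal{D}$, target $y$) is illustrative and not taken from the paper, which adapts the principle to NMT rather than regression:

```latex
% Standard bias-variance decomposition for squared loss (illustrative;
% the irreducible-noise term is omitted by assuming a deterministic target y).
% f_D(x): prediction of a model trained on dataset D.
\[
\mathbb{E}_{\mathcal{D}}\bigl[(f_{\mathcal{D}}(x) - y)^2\bigr]
  = \underbrace{\bigl(\mathbb{E}_{\mathcal{D}}[f_{\mathcal{D}}(x)] - y\bigr)^2}_{\text{bias}^2}
  + \underbrace{\mathbb{E}_{\mathcal{D}}\bigl[\bigl(f_{\mathcal{D}}(x) - \mathbb{E}_{\mathcal{D}}[f_{\mathcal{D}}(x)]\bigr)^2\bigr]}_{\text{variance}}
\]
```

Read against this identity, the abstract's claim is that TM augmentation shrinks the bias term while inflating the variance term, which is why it helps when training data is plentiful and hurts when data is scarce.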
- Anthology ID: 2023.findings-acl.162
- Volume: Findings of the Association for Computational Linguistics: ACL 2023
- Month: July
- Year: 2023
- Address: Toronto, Canada
- Editors: Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
- Venue: Findings
- Publisher: Association for Computational Linguistics
- Pages: 2589–2605
- URL: https://aclanthology.org/2023.findings-acl.162
- DOI: 10.18653/v1/2023.findings-acl.162
- Cite (ACL): Hongkun Hao, Guoping Huang, Lemao Liu, Zhirui Zhang, Shuming Shi, and Rui Wang. 2023. Rethinking Translation Memory Augmented Neural Machine Translation. In Findings of the Association for Computational Linguistics: ACL 2023, pages 2589–2605, Toronto, Canada. Association for Computational Linguistics. (See the BibTeX sketch after this list.)
- Cite (Informal): Rethinking Translation Memory Augmented Neural Machine Translation (Hao et al., Findings 2023)
- PDF: https://aclanthology.org/2023.findings-acl.162.pdf
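For convenience, a BibTeX sketch assembled from the metadata above; the entry key follows the Anthology's usual author-year-firstword convention and is an assumption, not copied from this page:

```latex
% BibTeX sketch built from the fields listed above.
% The key "hao-etal-2023-rethinking" is an assumed Anthology-style key.
@inproceedings{hao-etal-2023-rethinking,
  title     = "Rethinking Translation Memory Augmented Neural Machine Translation",
  author    = "Hao, Hongkun and Huang, Guoping and Liu, Lemao and
               Zhang, Zhirui and Shi, Shuming and Wang, Rui",
  editor    = "Rogers, Anna and Boyd-Graber, Jordan and Okazaki, Naoaki",
  booktitle = "Findings of the Association for Computational Linguistics: ACL 2023",
  month     = jul,
  year      = "2023",
  address   = "Toronto, Canada",
  publisher = "Association for Computational Linguistics",
  url       = "https://aclanthology.org/2023.findings-acl.162",
  doi       = "10.18653/v1/2023.findings-acl.162",
  pages     = "2589--2605",
}
```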