Revisiting Checkpoint Averaging for Neural Machine Translation

Yingbo Gao, Christian Herold, Zijian Yang, Hermann Ney


Abstract
Checkpoint averaging is a simple and effective method to boost the performance of converged neural machine translation models. The calculation is cheap to perform, and the fact that the translation improvement almost comes for free makes it widely adopted in neural machine translation research. Despite its popularity, the method itself simply takes the mean of the model parameters from several checkpoints, the selection of which is mostly based on empirical recipes with little justification. In this work, we revisit the concept of checkpoint averaging and consider several extensions. Specifically, we experiment with ideas such as using different checkpoint selection strategies, calculating a weighted average instead of a simple mean, making use of gradient information, and fine-tuning the interpolation weights on development data. Our results confirm the necessity of applying checkpoint averaging for optimal performance, but also suggest that the landscape between the converged checkpoints is rather flat and that little further improvement over simple averaging is to be obtained.
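The core operation the abstract describes — taking a (possibly weighted) mean of model parameters across several checkpoints — can be sketched in a few lines. The following is a minimal illustration in plain Python, with checkpoints represented as dicts mapping parameter names to lists of floats; the function name and data layout are illustrative, not taken from the paper or any particular toolkit:

```python
def average_checkpoints(checkpoints, weights=None):
    """Combine model checkpoints by parameter-wise weighted averaging.

    checkpoints: list of dicts, each mapping a parameter name to a
                 flat list of parameter values (a stand-in for tensors).
    weights:     optional interpolation weights, one per checkpoint;
                 defaults to uniform weights, i.e. the simple mean.
    """
    if weights is None:
        weights = [1.0 / len(checkpoints)] * len(checkpoints)
    assert len(weights) == len(checkpoints)
    assert abs(sum(weights) - 1.0) < 1e-9, "weights should sum to 1"

    averaged = {}
    for name in checkpoints[0]:
        n_values = len(checkpoints[0][name])
        # Weighted sum of the same parameter across all checkpoints.
        averaged[name] = [
            sum(w * ckpt[name][i] for w, ckpt in zip(weights, checkpoints))
            for i in range(n_values)
        ]
    return averaged
```

With uniform weights this reduces to the simple mean used in common recipes; passing non-uniform `weights` corresponds to the weighted-average variant the paper investigates.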
Anthology ID:
2022.findings-aacl.18
Volume:
Findings of the Association for Computational Linguistics: AACL-IJCNLP 2022
Month:
November
Year:
2022
Address:
Online only
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
188–196
URL:
https://aclanthology.org/2022.findings-aacl.18
Cite (ACL):
Yingbo Gao, Christian Herold, Zijian Yang, and Hermann Ney. 2022. Revisiting Checkpoint Averaging for Neural Machine Translation. In Findings of the Association for Computational Linguistics: AACL-IJCNLP 2022, pages 188–196, Online only. Association for Computational Linguistics.
Cite (Informal):
Revisiting Checkpoint Averaging for Neural Machine Translation (Gao et al., Findings 2022)
PDF:
https://preview.aclanthology.org/auto-file-uploads/2022.findings-aacl.18.pdf