Continual Learning of Neural Machine Translation within Low Forgetting Risk Regions

Shuhao Gu, Bojie Hu, Yang Feng


Abstract
This paper considers continual learning of a large-scale pretrained neural machine translation model without accessing the previous training data or introducing model separation. We argue that the widely used regularization-based methods, which perform multi-objective learning with an auxiliary loss, suffer from a misestimation problem and cannot always achieve a good balance between the previous and new tasks. To solve this problem, we propose a two-stage training method based on the local features of the real loss. We first search for low forgetting risk regions, where the model can retain its performance on the previous task as the parameters are updated, to avoid catastrophic forgetting. Then we continually train the model within these regions, using only the new training data, to fit the new task. Specifically, we propose two methods for finding low forgetting risk regions, based on the curvature of the loss and on the impact of the parameters on the model output, respectively. We conduct experiments on domain adaptation and on more challenging language adaptation tasks, and the results show that our method achieves significant improvements over several strong baselines.
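The abstract's two-stage recipe can be sketched in a minimal, hypothetical form: estimate per-parameter sensitivity on the previous task (here via a diagonal Fisher approximation, one common curvature proxy), derive an interval around each pretrained parameter that is wider where sensitivity is low, and project parameters back into those intervals while training on the new task. This is an illustrative NumPy sketch under those assumptions, not the authors' implementation; the function names and the `budget` parameter are invented for illustration.

```python
import numpy as np

def fisher_diag(per_sample_grads):
    # Diagonal Fisher approximation: mean of squared per-sample gradients
    # of the previous-task loss w.r.t. each parameter.
    return np.mean(np.square(per_sample_grads), axis=0)

def lfr_region(theta0, fisher, budget=1.0, eps=1e-8):
    # Low forgetting risk region: an interval around the pretrained
    # parameters theta0, wider where curvature (Fisher) is low, i.e.
    # where updates are unlikely to hurt previous-task performance.
    radius = budget / np.sqrt(fisher + eps)
    return theta0 - radius, theta0 + radius

def project(theta, low, high):
    # Stage two: after each new-task update, clip parameters back
    # into the low forgetting risk region.
    return np.clip(theta, low, high)

# Toy usage: 4 parameters, 2 previous-task gradient samples.
theta0 = np.zeros(4)
grads = np.array([[1.0, 0.1, 0.5, 0.01],
                  [0.8, 0.2, 0.4, 0.02]])
f = fisher_diag(grads)
low, high = lfr_region(theta0, f, budget=0.1)
# Pretend one SGD step on the new task moved the parameters:
theta_new = theta0 + np.array([1.0, -1.0, 0.05, 0.0])
theta_proj = project(theta_new, low, high)
```

Parameter 0 has the largest previous-task gradients, so it gets the narrowest interval and its large new-task update is clipped hard, while the nearly insensitive parameter 3 is left essentially free.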
Anthology ID:
2022.emnlp-main.111
Volume:
Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
1707–1718
URL:
https://aclanthology.org/2022.emnlp-main.111
DOI:
10.18653/v1/2022.emnlp-main.111
Cite (ACL):
Shuhao Gu, Bojie Hu, and Yang Feng. 2022. Continual Learning of Neural Machine Translation within Low Forgetting Risk Regions. In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, pages 1707–1718, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
Continual Learning of Neural Machine Translation within Low Forgetting Risk Regions (Gu et al., EMNLP 2022)
PDF:
https://preview.aclanthology.org/remove-xml-comments/2022.emnlp-main.111.pdf