Overcoming Catastrophic Forgetting beyond Continual Learning: Balanced Training for Neural Machine Translation

Chenze Shao, Yang Feng


Abstract
Neural networks tend to gradually forget previously learned knowledge when learning multiple tasks sequentially from dynamic data distributions. This problem, called catastrophic forgetting, is a fundamental challenge in the continual learning of neural networks. In this work, we observe that catastrophic forgetting not only occurs in continual learning but also affects traditional static training. Neural networks, especially neural machine translation models, suffer from catastrophic forgetting even when they learn from a static training set. Specifically, the final model pays imbalanced attention to training samples: recently exposed samples attract more attention than earlier ones. The underlying cause is that training samples do not receive balanced training in each model update, so we name this problem imbalanced training. To alleviate it, we propose Complementary Online Knowledge Distillation (COKD), which uses dynamically updated teacher models trained on specific data orders to iteratively provide complementary knowledge to the student model. Experimental results on multiple machine translation tasks show that our method successfully alleviates the problem of imbalanced training and achieves substantial improvements over strong baseline systems.
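The abstract describes COKD only at a high level. Below is a minimal, illustrative sketch of the general idea of complementary online knowledge distillation, assuming a PyTorch-style training loop: a teacher is updated online on the same data in a complementary (here, simply reversed) order, and its predictions are distilled into the student. The function names, the reversed-order choice, and the loss weighting are assumptions for illustration, not the authors' exact method; the official implementation is in the ictnlp/cokd repository.

import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, targets, alpha=0.5, T=1.0):
    # Cross-entropy on the gold targets plus KL divergence toward the teacher's
    # (temperature-softened) output distribution. alpha and T are illustrative.
    ce = F.cross_entropy(student_logits, targets)
    kl = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    return (1.0 - alpha) * ce + alpha * kl

def train_epoch(student, teacher, batches, student_opt, teacher_opt):
    # `batches` is a list of (inputs, targets) pairs. The teacher traverses the
    # data in the reverse order of the student, so the samples the student saw
    # earliest (and is most prone to forget) are the teacher's most recent ones,
    # and vice versa -- the teacher's knowledge complements the student's.
    for (x_s, y_s), (x_t, y_t) in zip(batches, reversed(batches)):
        # 1) Update the teacher on its own (complementary) data order.
        teacher_loss = F.cross_entropy(teacher(x_t), y_t)
        teacher_opt.zero_grad()
        teacher_loss.backward()
        teacher_opt.step()

        # 2) Update the student with the teacher's soft targets as extra supervision.
        with torch.no_grad():
            soft_targets = teacher(x_s)
        loss = distillation_loss(student(x_s), soft_targets, y_s)
        student_opt.zero_grad()
        loss.backward()
        student_opt.step()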
Anthology ID:
2022.acl-long.143
Volume:
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
May
Year:
2022
Address:
Dublin, Ireland
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
2023–2036
URL:
https://aclanthology.org/2022.acl-long.143
DOI:
10.18653/v1/2022.acl-long.143
Cite (ACL):
Chenze Shao and Yang Feng. 2022. Overcoming Catastrophic Forgetting beyond Continual Learning: Balanced Training for Neural Machine Translation. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 2023–2036, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Overcoming Catastrophic Forgetting beyond Continual Learning: Balanced Training for Neural Machine Translation (Shao & Feng, ACL 2022)
PDF:
https://preview.aclanthology.org/auto-file-uploads/2022.acl-long.143.pdf
Code:
ictnlp/cokd
Data:
CIFAR-10, CIFAR-100