Investigating Catastrophic Forgetting During Continual Training for Neural Machine Translation

Shuhao Gu, Yang Feng


Abstract
Neural machine translation (NMT) models often suffer from catastrophic forgetting during continual training: the model gradually forgets previously learned knowledge as it fits newly added data that may follow a different distribution, e.g., a different domain. Although many methods have been proposed to mitigate this problem, the cause of the phenomenon is still not well understood. In the setting of domain adaptation, we investigate the cause of catastrophic forgetting from the perspectives of modules and parameters (neurons). The investigation of the NMT model's modules shows that some modules are closely tied to general-domain knowledge, while others are more essential for domain adaptation. The investigation of the parameters shows that some parameters are important for both general-domain and in-domain translation, and that their large changes during continual training cause the performance decline on the general domain. We conducted experiments across different language pairs and domains to ensure the validity and reliability of our findings.
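As a rough illustration of the parameter-level analysis the abstract describes, the sketch below compares a general-domain checkpoint with one obtained after continual in-domain training and reports the relative change of each parameter tensor. The checkpoint filenames, the flat state-dict layout, and the relative-L2 change metric are illustrative assumptions, not the paper's exact procedure.

```python
# Minimal sketch (assumed setup, not the authors' method): measure how much
# each parameter tensor moves during continual in-domain training. Large
# changes in parameters that also mattered for general-domain translation
# are the kind of signal the paper links to catastrophic forgetting.
import torch

def parameter_change(general_ckpt_path: str, adapted_ckpt_path: str):
    """Return the relative L2 change of each parameter tensor between
    a general-domain checkpoint and a continually trained one.
    Assumes each checkpoint is a flat state_dict of tensors."""
    general = torch.load(general_ckpt_path, map_location="cpu")
    adapted = torch.load(adapted_ckpt_path, map_location="cpu")
    changes = {}
    for name, w_old in general.items():
        if name not in adapted or not torch.is_tensor(w_old):
            continue
        w_new = adapted[name]
        denom = w_old.norm().item() or 1.0  # guard against zero-norm tensors
        changes[name] = (w_new - w_old).norm().item() / denom
    return changes

if __name__ == "__main__":
    # Hypothetical checkpoint filenames, for illustration only.
    deltas = parameter_change("general_domain.pt", "in_domain_finetuned.pt")
    # Print the ten most-changed parameter tensors.
    for name, delta in sorted(deltas.items(), key=lambda kv: -kv[1])[:10]:
        print(f"{name}: relative change {delta:.4f}")
```

Ranking tensors this way gives a quick, module-by-module view of where continual training concentrates its updates, which mirrors the module- and parameter-level perspective of the paper's investigation.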
Anthology ID:
2020.coling-main.381
Volume:
Proceedings of the 28th International Conference on Computational Linguistics
Month:
December
Year:
2020
Address:
Barcelona, Spain (Online)
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
4315–4326
URL:
https://aclanthology.org/2020.coling-main.381
DOI:
10.18653/v1/2020.coling-main.381
Cite (ACL):
Shuhao Gu and Yang Feng. 2020. Investigating Catastrophic Forgetting During Continual Training for Neural Machine Translation. In Proceedings of the 28th International Conference on Computational Linguistics, pages 4315–4326, Barcelona, Spain (Online). International Committee on Computational Linguistics.
Cite (Informal):
Investigating Catastrophic Forgetting During Continual Training for Neural Machine Translation (Gu & Feng, COLING 2020)
PDF:
https://preview.aclanthology.org/update-css-js/2020.coling-main.381.pdf