MKT: A Multi-Stage Knowledge Transfer Framework to Mitigate Catastrophic Forgetting in Multi-Domain Chinese Spelling Correction
Peng Xing | Yinghui Li | Shirong Ma | Xinnian Liang | Haojing Huang | Yangning Li | Shu-Yu Guo | Hai-Tao Zheng | Wenhao Jiang | Ying Shen
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing: Industry Track
Chinese Spelling Correction (CSC) aims to detect and correct spelling errors in given sentences. Recently, multi-domain CSC has gradually attracted the attention of researchers because it is more practical. In this paper, we focus on the key flaw of the CSC model when adapting to multi-domain scenarios: the tendency to forget previously acquired knowledge upon learning new domain-specific knowledge (i.e., **catastrophic forgetting**). To address this, we propose a novel model-agnostic **M**ulti-stage **K**nowledge **T**ransfer (**MKT**) framework with an evolving teacher model and dynamic distillation weights for knowledge transfer in each domain, rather than focusing solely on new domain knowledge. Notably, we are the first to apply continual learning methods to the multi-domain CSC task. Experiments demonstrate our method's effectiveness over traditional approaches, highlighting the importance of overcoming catastrophic forgetting to enhance model performance.
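The abstract describes MKT only at a high level. As a rough illustration of the general idea (distilling from a frozen previous-stage teacher while learning a new domain), here is a minimal PyTorch sketch. It is an assumption, not the paper's implementation: the function names, the stage-dependent `alpha` schedule, and the loss combination are all hypothetical stand-ins for the evolving teacher and dynamic distillation weights described above.

```python
# Illustrative sketch only; the exact MKT loss, teacher-update rule,
# and weight schedule are assumptions, not the paper's published code.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL divergence between softened teacher and student distributions."""
    t = temperature
    p_teacher = F.softmax(teacher_logits / t, dim=-1)
    log_p_student = F.log_softmax(student_logits / t, dim=-1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * (t * t)

def multi_stage_step(student, teacher, batch, stage, num_stages):
    """One training step while adapting to a new domain.

    `teacher` is a frozen copy of the student from the previous stage,
    so knowledge from earlier domains is transferred rather than overwritten.
    `alpha` grows with the stage index -- a hypothetical stand-in for the
    paper's dynamic distillation weights.
    """
    inputs, labels = batch
    with torch.no_grad():
        teacher_logits = teacher(inputs)
    student_logits = student(inputs)

    alpha = stage / num_stages  # hypothetical schedule
    loss_new = F.cross_entropy(
        student_logits.view(-1, student_logits.size(-1)), labels.view(-1)
    )
    loss_kd = distillation_loss(student_logits, teacher_logits)
    return (1 - alpha) * loss_new + alpha * loss_kd
```

In this reading, the student after finishing a domain becomes the teacher for the next stage, which is one simple way an "evolving teacher" could be realized.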