Self-Supervised Curriculum Learning for Spelling Error Correction

Zifa Gan, Hongfei Xu, Hongying Zan


Abstract
Spelling Error Correction (SEC), which requires high-level language understanding, is a challenging but useful task. Current SEC approaches normally follow a pre-training then fine-tuning procedure that treats all training data equally. By contrast, Curriculum Learning (CL) utilizes training data differently during training and has proven effective in improving both performance and training efficiency in many other NLP tasks. In Neural Machine Translation (NMT), a model's performance has been shown to be sensitive to the difficulty of training examples, and CL has been shown effective in addressing this. In SEC, data from different language learners are naturally distributed across difficulty levels (some errors made by beginners are obvious to correct, while some made by fluent speakers are hard), and we expect that designing a corresponding curriculum for model learning may likewise aid training and bring about better performance. In this paper, we study how to further improve the performance of the state-of-the-art SEC method with CL, and propose a Self-Supervised Curriculum Learning (SSCL) approach. Specifically, we directly use the cross-entropy loss as the criterion for: 1) scoring the difficulty of training data, and 2) evaluating the competence of the model. In our approach, CL improves model training, which in turn improves the CL measurement. In our experiments on the SIGHAN 2015 Chinese spelling check task, we show that SSCL is superior to previous norm-based and uncertainty-aware approaches, and establishes a new state of the art (74.38% F1).
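To make the loss-as-criterion idea concrete, below is a minimal sketch (not the authors' released implementation) of how per-example cross-entropy might score difficulty and how a competence estimate might gate which examples the model sees. The function names, the padding id, and the linear competence schedule are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn.functional as F

def score_difficulty(model, dataloader, pad_id=0, device="cpu"):
    """Score each training example by its mean token cross-entropy
    under the current model: higher loss = harder example.
    Assumes `model(inputs)` returns logits of shape (batch, seq, vocab)."""
    model.eval()
    scores = []
    with torch.no_grad():
        for inputs, targets in dataloader:
            logits = model(inputs.to(device))
            tgt = targets.to(device)
            # Per-token loss, shape (batch, seq); padding positions get 0.
            tok_loss = F.cross_entropy(
                logits.transpose(1, 2), tgt,
                ignore_index=pad_id, reduction="none",
            )
            mask = (tgt != pad_id).float()
            # Mean over non-padding tokens for each example.
            per_example = (tok_loss * mask).sum(1) / mask.sum(1).clamp(min=1)
            scores.extend(per_example.tolist())
    return scores

def competence(current_loss, initial_loss, c0=0.1):
    """Hypothetical competence estimate in [c0, 1]: as the model's
    average loss falls from its initial value, competence grows and a
    larger, harder slice of the sorted data becomes eligible."""
    progress = max(0.0, 1.0 - current_loss / initial_loss)
    return min(1.0, c0 + (1.0 - c0) * progress)
```

Under this sketch, each training stage would sort examples by difficulty score and sample only from the easiest competence-fraction; periodically re-scoring with the improving model realizes the feedback loop described in the abstract, where better training in turn sharpens the curriculum signal.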
Anthology ID:
2021.emnlp-main.281
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
3487–3494
URL:
https://aclanthology.org/2021.emnlp-main.281
DOI:
10.18653/v1/2021.emnlp-main.281
Cite (ACL):
Zifa Gan, Hongfei Xu, and Hongying Zan. 2021. Self-Supervised Curriculum Learning for Spelling Error Correction. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 3487–3494, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Self-Supervised Curriculum Learning for Spelling Error Correction (Gan et al., EMNLP 2021)
PDF:
https://preview.aclanthology.org/nodalida-main-page/2021.emnlp-main.281.pdf
Video:
https://preview.aclanthology.org/nodalida-main-page/2021.emnlp-main.281.mp4