Are Pre-trained Language Models Useful for Model Ensemble in Chinese Grammatical Error Correction?

Chenming Tang, Xiuyu Wu, Yunfang Wu


Abstract
Model ensemble has been in widespread use for Grammatical Error Correction (GEC), boosting model performance. We hypothesize that model ensemble based on the perplexity (PPL) computed by pre-trained language models (PLMs) should benefit the GEC system. To this end, we explore several ensemble strategies based on strong PLMs with four sophisticated single models. However, performance does not improve but even deteriorates after the PLM-based ensemble. This surprising result prompts us to conduct a detailed analysis of the data, which yields some insights into GEC: the human references of correct sentences in the test data are far from sufficient, and the gap between a correct sentence and an idiomatic one deserves attention. Moreover, the PLM-based ensemble strategies provide an effective way to extend and improve GEC benchmark data. Our source code is available at https://github.com/JamyDon/PLM-based-CGEC-Model-Ensemble.
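The core idea of the PPL-based ensemble can be illustrated as follows: each single GEC model proposes a corrected sentence, and the candidate with the lowest perplexity under a pre-trained language model is chosen as the final output. Below is a minimal sketch of this strategy, not the authors' implementation (their actual code is in the repository linked above); the PLM name and the candidate sentences are illustrative assumptions.

```python
# Minimal sketch of PPL-based ensemble for GEC (illustrative, not the paper's code):
# score each candidate correction by its perplexity under a pre-trained causal LM
# and keep the lowest-scoring one.
import math

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumption: any Chinese causal LM works here; this model name is just an example.
MODEL_NAME = "uer/gpt2-chinese-cluecorpussmall"
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForCausalLM.from_pretrained(MODEL_NAME)
model.eval()


def perplexity(sentence: str) -> float:
    """Perplexity under the causal LM: exp of the mean token-level NLL."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        # With labels == input_ids, the model returns the mean cross-entropy loss.
        loss = model(**inputs, labels=inputs["input_ids"]).loss
    return math.exp(loss.item())


def ensemble_by_ppl(candidates: list[str]) -> str:
    """Select the candidate correction with the lowest perplexity."""
    return min(candidates, key=perplexity)


# Hypothetical outputs of several single GEC models for one source sentence:
candidates = ["他昨天去了学校。", "他昨天去学校了。", "他昨天去学校。"]
print(ensemble_by_ppl(candidates))
```

The paper's finding is that selecting by PLM perplexity in this way does not necessarily improve over the single models, largely because low-PPL outputs can be idiomatic rewrites that the limited human references do not cover.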
Anthology ID:
2023.acl-short.77
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
893–901
URL:
https://aclanthology.org/2023.acl-short.77
DOI:
10.18653/v1/2023.acl-short.77
Cite (ACL):
Chenming Tang, Xiuyu Wu, and Yunfang Wu. 2023. Are Pre-trained Language Models Useful for Model Ensemble in Chinese Grammatical Error Correction? In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 893–901, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Are Pre-trained Language Models Useful for Model Ensemble in Chinese Grammatical Error Correction? (Tang et al., ACL 2023)
PDF:
https://preview.aclanthology.org/landing_page/2023.acl-short.77.pdf
Video:
https://preview.aclanthology.org/landing_page/2023.acl-short.77.mp4