Chinese Grammatical Correction Using BERT-based Pre-trained Model

Hongfei Wang, Michiki Kurosawa, Satoru Katsumata, Mamoru Komachi


Abstract
In recent years, pre-trained models have been extensively studied, and several downstream tasks have benefited from their utilization. In this study, we verify the effectiveness of two methods that incorporate a pre-trained model into an encoder-decoder model on Chinese grammatical error correction tasks. We also analyze the error types and conclude that sentence-level errors are yet to be addressed.
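The abstract mentions two ways of incorporating a pre-trained model into an encoder-decoder correction model but does not spell them out here. The sketch below is not the authors' implementation; it is a minimal illustration of one common variant, warm-starting both the encoder and the decoder of a seq2seq corrector from Chinese BERT weights, using the Hugging Face transformers library. The checkpoint name, example sentence, and generation settings are assumptions for illustration only.

# Hypothetical sketch: initialize a seq2seq corrector from Chinese BERT weights.
# Not the authors' code; it only illustrates the general idea of reusing a
# BERT-based pre-trained model inside an encoder-decoder GEC system.
from transformers import BertTokenizer, EncoderDecoderModel

tokenizer = BertTokenizer.from_pretrained("bert-base-chinese")

# Warm-start both encoder and decoder from the same pre-trained checkpoint.
model = EncoderDecoderModel.from_encoder_decoder_pretrained(
    "bert-base-chinese", "bert-base-chinese"
)
model.config.decoder_start_token_id = tokenizer.cls_token_id
model.config.pad_token_id = tokenizer.pad_token_id
model.config.eos_token_id = tokenizer.sep_token_id
model.config.vocab_size = model.config.encoder.vocab_size

# Toy correction example (in practice the model is first fine-tuned on GEC data).
src = tokenizer("这本书很有意思的。", return_tensors="pt")  # learner-style input sentence
outputs = model.generate(src.input_ids, max_length=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))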
Anthology ID:
2020.aacl-main.20
Volume:
Proceedings of the 1st Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 10th International Joint Conference on Natural Language Processing
Month:
December
Year:
2020
Address:
Suzhou, China
Editors:
Kam-Fai Wong, Kevin Knight, Hua Wu
Venue:
AACL
Publisher:
Association for Computational Linguistics
Pages:
163–168
URL:
https://aclanthology.org/2020.aacl-main.20
Cite (ACL):
Hongfei Wang, Michiki Kurosawa, Satoru Katsumata, and Mamoru Komachi. 2020. Chinese Grammatical Correction Using BERT-based Pre-trained Model. In Proceedings of the 1st Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 10th International Joint Conference on Natural Language Processing, pages 163–168, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Chinese Grammatical Correction Using BERT-based Pre-trained Model (Wang et al., AACL 2020)
PDF:
https://preview.aclanthology.org/nschneid-patch-2/2020.aacl-main.20.pdf