Hongfei Wang


2020

TMU-NLP System Using BERT-based Pre-trained Model to the NLP-TEA CGED Shared Task 2020
Hongfei Wang | Mamoru Komachi
Proceedings of the 6th Workshop on Natural Language Processing Techniques for Educational Applications

In this paper, we introduce our system for the NLP-TEA 2020 shared task on Chinese Grammatical Error Diagnosis (CGED). In recent years, pre-trained models have been extensively studied, and several downstream tasks have benefited from their use. In this study, we treat the grammatical error diagnosis (GED) task as a grammatical error correction (GEC) problem and propose a method that incorporates a pre-trained model into an encoder-decoder model to solve it.
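The abstract frames GED as a sequence-to-sequence GEC problem with a pre-trained model inside an encoder-decoder. A minimal sketch of one way to set this up, assuming Hugging Face transformers, the bert-base-chinese checkpoint, and warm-starting both encoder and decoder from BERT; these are illustrative choices, not the paper's confirmed configuration:

```python
# Sketch: casting GED as seq2seq GEC with a BERT-initialized encoder-decoder.
# bert-base-chinese and transformers' EncoderDecoderModel are assumptions
# for illustration, not the paper's confirmed setup.
from transformers import BertTokenizerFast, EncoderDecoderModel

tokenizer = BertTokenizerFast.from_pretrained("bert-base-chinese")

# Warm-start both the encoder and the decoder from BERT weights.
model = EncoderDecoderModel.from_encoder_decoder_pretrained(
    "bert-base-chinese", "bert-base-chinese"
)
model.config.decoder_start_token_id = tokenizer.cls_token_id
model.config.pad_token_id = tokenizer.pad_token_id
model.config.eos_token_id = tokenizer.sep_token_id

# One training step: erroneous sentence in, corrected sentence as labels.
src = tokenizer("我对中国文化很兴趣。", return_tensors="pt")   # missing 感
tgt = tokenizer("我对中国文化很感兴趣。", return_tensors="pt")  # corrected
outputs = model(
    input_ids=src.input_ids,
    attention_mask=src.attention_mask,
    labels=tgt.input_ids,
)
outputs.loss.backward()  # cross-entropy over the corrected token sequence
```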

Chinese Grammatical Correction Using BERT-based Pre-trained Model
Hongfei Wang | Michiki Kurosawa | Satoru Katsumata | Mamoru Komachi
Proceedings of the 1st Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 10th International Joint Conference on Natural Language Processing

In recent years, pre-trained models have been extensively studied, and several downstream tasks have benefited from their use. In this study, we verify the effectiveness of two methods that incorporate a pre-trained model into an encoder-decoder model on Chinese grammatical error correction tasks. We also analyze the error types and conclude that sentence-level errors remain to be addressed.
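At inference time, such an encoder-decoder corrects a sentence by beam-search decoding, and error diagnoses can be read off by diffing input and output. A hedged usage sketch continuing the assumed transformers setup from the block above; the fine-tuned checkpoint path is hypothetical:

```python
# Inference sketch (hypothetical fine-tuned checkpoint path; transformers
# setup assumed as in the previous block, not the paper's confirmed code).
from transformers import BertTokenizerFast, EncoderDecoderModel

tokenizer = BertTokenizerFast.from_pretrained("bert-base-chinese")
model = EncoderDecoderModel.from_pretrained("path/to/finetuned-gec")  # hypothetical

source = "我对中国文化很兴趣。"  # learner sentence with a missing character
inputs = tokenizer(source, return_tensors="pt")
output_ids = model.generate(
    inputs.input_ids,
    attention_mask=inputs.attention_mask,
    max_length=64,
    num_beams=5,
    early_stopping=True,
)
corrected = tokenizer.decode(output_ids[0], skip_special_tokens=True)
print(corrected)  # BERT's tokenizer decodes Chinese with spaces between characters
```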