Focal Training and Tagger Decouple for Grammatical Error Correction

Minghuan Tan, Min Yang, Ruifeng Xu


Abstract
In this paper, we investigate how to improve tagging-based Grammatical Error Correction models. We address two issues in current tagging-based approaches: the label imbalance issue and the tagging entanglement issue. We then propose to down-weight the loss of well-classified labels using Focal Loss and to decouple the error detection layer from the label tagging layer through an extra self-attention-based matching module. Experiments on three recent Chinese Grammatical Error Correction datasets show that our proposed methods are effective. We further analyze hyper-parameter choices for Focal Loss and inference tweaking.
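
For reference, the "down-weight the loss of well-classified labels" idea follows the standard Focal Loss formulation (Lin et al., 2017), FL(p_t) = -(1 - p_t)^gamma * log(p_t). The sketch below shows how it could be applied to token-level label tagging; the function name, the gamma default, and the masking details are illustrative assumptions, not the authors' exact implementation.

import torch
import torch.nn.functional as F

def token_focal_loss(logits, targets, gamma=2.0, ignore_index=-100):
    """Focal Loss over per-token label predictions.

    Down-weights well-classified labels by (1 - p_t)^gamma, where p_t is
    the probability assigned to the gold label. gamma=0 recovers plain
    cross-entropy.
    """
    # logits: (batch, seq_len, num_labels); targets: (batch, seq_len)
    log_probs = F.log_softmax(logits, dim=-1)
    ce = F.nll_loss(
        log_probs.view(-1, log_probs.size(-1)),
        targets.view(-1),
        ignore_index=ignore_index,
        reduction="none",
    )
    pt = torch.exp(-ce)                 # probability of the gold label
    loss = ((1.0 - pt) ** gamma) * ce   # focal down-weighting
    mask = targets.view(-1) != ignore_index
    return loss[mask].mean()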
Anthology ID:
2023.findings-acl.370
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
5978–5985
URL:
https://aclanthology.org/2023.findings-acl.370
DOI:
10.18653/v1/2023.findings-acl.370
Cite (ACL):
Minghuan Tan, Min Yang, and Ruifeng Xu. 2023. Focal Training and Tagger Decouple for Grammatical Error Correction. In Findings of the Association for Computational Linguistics: ACL 2023, pages 5978–5985, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Focal Training and Tagger Decouple for Grammatical Error Correction (Tan et al., Findings 2023)
PDF:
https://preview.aclanthology.org/naacl24-info/2023.findings-acl.370.pdf