Efficient Grammatical Error Correction Via Multi-Task Training and Optimized Training Schedule

Andrey Bout, Alexander Podolskiy, Sergey Nikolenko, Irina Piontkovskaya


Abstract
Progress in neural grammatical error correction (GEC) is hindered by the lack of annotated training data. Since sufficient amounts of high-quality manually annotated data are not available, recent research has relied on generating synthetic data, pretraining on it, and then fine-tuning on real datasets; performance gains have been achieved either by ensembling or by using huge pretrained models such as T5-XXL as the backbone. In this work, we explore an orthogonal direction: how to use available data more efficiently. First, we propose auxiliary tasks that exploit the alignment between the original and corrected sentences, such as predicting a sequence of corrections. We formulate each task as a sequence-to-sequence problem and perform multi-task training. Second, we discover that the order of the datasets used for training, and even of individual instances within a dataset, may have important effects on final performance, so we set out to find the best training schedule. Together, these two ideas lead to significant improvements, producing results that improve the state of the art with much smaller models; in particular, we outperform the best models based on T5-XXL (11B parameters) with a BART-based model (400M parameters).
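To make the multi-task setup concrete, below is a minimal sketch of how an auxiliary "sequence of corrections" target could be derived from an aligned (source, corrected) pair and packed alongside the main GEC task as prefixed sequence-to-sequence instances. The abstract does not specify the alignment algorithm or the target format; the difflib-based alignment, the task prefixes ("correct:", "edits:"), and the edit-tag vocabulary here are illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch: deriving an auxiliary edit-sequence target and building
# multi-task seq2seq instances. Alignment via difflib is an assumption;
# the paper's alignment and target format may differ.
import difflib

def correction_sequence(source: str, corrected: str) -> str:
    """Render word-level edits between source and corrected as a flat target string."""
    src, tgt = source.split(), corrected.split()
    edits = []
    for tag, i1, i2, j1, j2 in difflib.SequenceMatcher(a=src, b=tgt).get_opcodes():
        if tag == "replace":
            edits.append(f"REPLACE {' '.join(src[i1:i2])} -> {' '.join(tgt[j1:j2])}")
        elif tag == "delete":
            edits.append(f"DELETE {' '.join(src[i1:i2])}")
        elif tag == "insert":
            edits.append(f"INSERT {' '.join(tgt[j1:j2])} AFTER {src[i1 - 1] if i1 else '<s>'}")
    return " ; ".join(edits) if edits else "KEEP"

def multitask_instances(source: str, corrected: str) -> list[tuple[str, str]]:
    """One (input, target) pair per task; task prefixes are hypothetical."""
    return [
        ("correct: " + source, corrected),                             # main GEC task
        ("edits: " + source, correction_sequence(source, corrected)),  # auxiliary task
    ]

# Example: both instances would be mixed into a single training stream.
for inp, out in multitask_instances("He go to school yesterday .",
                                    "He went to school yesterday ."):
    print(inp, "=>", out)
# correct: He go to school yesterday . => He went to school yesterday .
# edits: He go to school yesterday . => REPLACE go -> went
```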
Anthology ID: 2023.emnlp-main.355
Volume: Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month: December
Year: 2023
Address: Singapore
Editors: Houda Bouamor, Juan Pino, Kalika Bali
Venue: EMNLP
Publisher: Association for Computational Linguistics
Pages: 5800–5816
URL: https://aclanthology.org/2023.emnlp-main.355
DOI: 10.18653/v1/2023.emnlp-main.355
Cite (ACL): Andrey Bout, Alexander Podolskiy, Sergey Nikolenko, and Irina Piontkovskaya. 2023. Efficient Grammatical Error Correction Via Multi-Task Training and Optimized Training Schedule. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 5800–5816, Singapore. Association for Computational Linguistics.
Cite (Informal): Efficient Grammatical Error Correction Via Multi-Task Training and Optimized Training Schedule (Bout et al., EMNLP 2023)
PDF: https://preview.aclanthology.org/nschneid-patch-4/2023.emnlp-main.355.pdf
Video: https://preview.aclanthology.org/nschneid-patch-4/2023.emnlp-main.355.mp4