Grammatical Error Correction through Round-Trip Machine Translation

Yova Kementchedjhieva, Anders Søgaard


Abstract
Machine translation (MT) operates on the premise of an interlingua which abstracts away from surface form while preserving meaning. A decade ago, the idea of using round-trip MT to guide grammatical error correction (GEC) was proposed as a way to abstract away from potential errors in surface forms (Madnani et al., 2012). At the time, it did not pan out, owing to the low quality of the MT systems of the day. Today, much stronger MT systems are available, so we re-evaluate this idea across five languages and models of various sizes. We find that for extra-large models, input augmentation through round-trip MT has little to no effect. For more ‘workable’ model sizes, however, it yields consistent improvements, sometimes bringing the performance of a _base_ or _large_ model up to that of a _large_ or _xl_ model, respectively. The round-trip translation comes at a computational cost, though, so one would have to decide between a larger model and input augmentation on a case-by-case basis.
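
The abstract's core idea is to augment the GEC input with a round-trip translation of the same sentence. As a minimal sketch of that setup, the snippet below translates an English sentence into a pivot language and back with off-the-shelf MarianMT checkpoints, then concatenates the result with the original sentence for a downstream GEC model. The choice of pivot language (German), the `Helsinki-NLP` checkpoints, and the concatenation scheme with a `[RT]` separator are illustrative assumptions; the paper's actual MT systems and fusion strategy are not specified in this abstract.

```python
# Hypothetical sketch of round-trip MT input augmentation for GEC.
# Pivot language, checkpoints, and fusion scheme are illustrative assumptions.
from transformers import MarianMTModel, MarianTokenizer


def load(name):
    """Load a MarianMT tokenizer/model pair from the Hugging Face Hub."""
    return MarianTokenizer.from_pretrained(name), MarianMTModel.from_pretrained(name)


fwd_tok, fwd_model = load("Helsinki-NLP/opus-mt-en-de")  # English -> German
bwd_tok, bwd_model = load("Helsinki-NLP/opus-mt-de-en")  # German -> English


def translate(text, tok, model):
    """Translate a single sentence with a MarianMT model."""
    batch = tok([text], return_tensors="pt", padding=True)
    out = model.generate(**batch, max_new_tokens=128)
    return tok.batch_decode(out, skip_special_tokens=True)[0]


def round_trip(sentence):
    """Translate to the pivot language and back; the round trip tends to
    normalize surface errors while (roughly) preserving meaning."""
    pivot = translate(sentence, fwd_tok, fwd_model)
    return translate(pivot, bwd_tok, bwd_model)


def augment_for_gec(sentence, sep=" [RT] "):
    """Concatenate the original sentence with its round-trip translation so a
    downstream GEC model can attend to both (fusion scheme is assumed here)."""
    return sentence + sep + round_trip(sentence)


print(augment_for_gec("She go to school every days ."))
```

The extra forward and backward translation passes are where the computational cost mentioned in the abstract comes from: every GEC input now requires two additional MT decoding steps before correction.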
Anthology ID:
2023.findings-eacl.165
Volume:
Findings of the Association for Computational Linguistics: EACL 2023
Month:
May
Year:
2023
Address:
Dubrovnik, Croatia
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
2163–2170
URL:
https://aclanthology.org/2023.findings-eacl.165
Cite (ACL):
Yova Kementchedjhieva and Anders Søgaard. 2023. Grammatical Error Correction through Round-Trip Machine Translation. In Findings of the Association for Computational Linguistics: EACL 2023, pages 2163–2170, Dubrovnik, Croatia. Association for Computational Linguistics.
Cite (Informal):
Grammatical Error Correction through Round-Trip Machine Translation (Kementchedjhieva & Søgaard, Findings 2023)
PDF:
https://preview.aclanthology.org/author-url/2023.findings-eacl.165.pdf