@inproceedings{staruch-etal-2025-adapting,
    title = "Adapting {LLM}s for Minimal-edit Grammatical Error Correction",
    author = "Staruch, Ryszard  and
      Gralinski, Filip  and
      Dzienisiewicz, Daniel",
    editor = {Kochmar, Ekaterina  and
      Alhafni, Bashar  and
      Bexte, Marie  and
      Burstein, Jill  and
      Horbach, Andrea  and
      Laarmann-Quante, Ronja  and
      Tack, Ana{\"i}s  and
      Yaneva, Victoria  and
      Yuan, Zheng},
    booktitle = "Proceedings of the 20th Workshop on Innovative Use of NLP for Building Educational Applications (BEA 2025)",
    month = jul,
    year = "2025",
    address = "Vienna, Austria",
    publisher = "Association for Computational Linguistics",
    url = "https://preview.aclanthology.org/ingest-emnlp/2025.bea-1.9/",
    doi = "10.18653/v1/2025.bea-1.9",
    pages = "118--128",
    ISBN = "979-8-89176-270-1",
    abstract = "Decoder-only large language models have shown superior performance in the fluency-edit English Grammatical Error Correction, but their adaptation for minimal-edit English GEC is still underexplored. To improve their effectiveness in the minimal-edit approach, we explore the error rate adaptation topic and propose a novel training schedule method. Our experiments set a new state-of-the-art result for a single-model system on the BEA-test set. We also detokenize the most common English GEC datasets to match the natural way of writing text. During the process, we find that there are errors in them. Our experiments analyze whether training on detokenized datasets impacts the results and measure the impact of the usage of the datasets with corrected erroneous examples. To facilitate reproducibility, we have released the source code used to train our models."
}

Markdown (Informal)
[Adapting LLMs for Minimal-edit Grammatical Error Correction](https://preview.aclanthology.org/ingest-emnlp/2025.bea-1.9/) (Staruch et al., BEA 2025)
ACL
Ryszard Staruch, Filip Gralinski, and Daniel Dzienisiewicz. 2025. Adapting LLMs for Minimal-edit Grammatical Error Correction. In Proceedings of the 20th Workshop on Innovative Use of NLP for Building Educational Applications (BEA 2025), pages 118–128, Vienna, Austria. Association for Computational Linguistics.