Context is Key: Grammatical Error Detection with Contextual Word Representations

Samuel Bell, Helen Yannakoudakis, Marek Rei


Abstract
Grammatical error detection (GED) in non-native writing requires systems to identify a wide range of errors in text written by language learners. Error detection as a purely supervised task can be challenging, as GED datasets are limited in size and the label distributions are highly imbalanced. Contextualized word representations offer a possible solution, as they can efficiently capture compositional information in language and can be optimized on large amounts of unsupervised data. In this paper, we perform a systematic comparison of ELMo, BERT and Flair embeddings (Peters et al., 2018; Devlin et al., 2018; Akbik et al., 2018) on a range of public GED datasets, and propose an approach to effectively integrate such representations in current methods, achieving a new state of the art on GED. We further analyze the strengths and weaknesses of different contextual embeddings for the task at hand, and present detailed analyses of their impact on different types of errors.
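The abstract frames GED as token-level sequence labeling over contextual word representations. A minimal sketch of that setup, scoring each token as correct/incorrect with a per-token logistic classifier over precomputed contextual vectors (the random embeddings, weights, and dimensions here are hypothetical stand-ins, not the authors' implementation):

```python
import numpy as np

def detect_errors(contextual_embeddings, W, b):
    """Label each token as correct (0) or erroneous (1).

    contextual_embeddings: (num_tokens, dim) array, e.g. precomputed
    ELMo/BERT/Flair vectors (random stand-ins in this sketch).
    W, b: parameters of a hypothetical per-token logistic classifier.
    """
    logits = contextual_embeddings @ W + b        # (num_tokens,)
    probs = 1.0 / (1.0 + np.exp(-logits))         # sigmoid -> P(error)
    return (probs > 0.5).astype(int), probs

# Toy example: a 5-token sentence with 8-dim "contextual" vectors.
rng = np.random.default_rng(0)
emb = rng.standard_normal((5, 8))
W = rng.standard_normal(8)
b = 0.0
labels, probs = detect_errors(emb, W, b)
print(labels, probs.round(2))
```

In the paper's actual systems the classifier sits on top of a trained neural sequence labeler rather than a single linear layer; this sketch only illustrates the per-token binary decision the task requires.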
Anthology ID:
W19-4410
Volume:
Proceedings of the Fourteenth Workshop on Innovative Use of NLP for Building Educational Applications
Month:
August
Year:
2019
Address:
Florence, Italy
Editors:
Helen Yannakoudakis, Ekaterina Kochmar, Claudia Leacock, Nitin Madnani, Ildikó Pilán, Torsten Zesch
Venue:
BEA
SIG:
SIGEDU
Publisher:
Association for Computational Linguistics
Pages:
103–115
URL:
https://aclanthology.org/W19-4410
DOI:
10.18653/v1/W19-4410
Cite (ACL):
Samuel Bell, Helen Yannakoudakis, and Marek Rei. 2019. Context is Key: Grammatical Error Detection with Contextual Word Representations. In Proceedings of the Fourteenth Workshop on Innovative Use of NLP for Building Educational Applications, pages 103–115, Florence, Italy. Association for Computational Linguistics.
Cite (Informal):
Context is Key: Grammatical Error Detection with Contextual Word Representations (Bell et al., BEA 2019)
PDF:
https://preview.aclanthology.org/add_acl24_videos/W19-4410.pdf
Code:
samueljamesbell/sequence-labeler
Data:
FCE, JFLEG, One Billion Word Benchmark