Abstract
Grammatical error correction (GEC) is one of the areas in natural language processing in which purely neural models have not yet superseded more traditional symbolic models. Hybrid systems combining phrase-based statistical machine translation (SMT) and neural sequence models are currently among the most effective approaches to GEC. However, both SMT and neural sequence-to-sequence models require large amounts of annotated data. Language model based GEC (LM-GEC) is a promising alternative which does not rely on annotated training data. We show how to improve LM-GEC by applying modelling techniques based on finite state transducers. We report further gains by rescoring with neural language models. We show that our methods developed for LM-GEC can also be used with SMT systems if annotated training data is available. Our best system outperforms the best published result on the CoNLL-2014 test set, and achieves far better relative improvements over the SMT baselines than previous hybrid systems.
- Anthology ID:
- N19-1406
- Volume:
- Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers)
- Month:
- June
- Year:
- 2019
- Address:
- Minneapolis, Minnesota
- Editors:
- Jill Burstein, Christy Doran, Thamar Solorio
- Venue:
- NAACL
- Publisher:
- Association for Computational Linguistics
- Pages:
- 4033–4039
- URL:
- https://aclanthology.org/N19-1406
- DOI:
- 10.18653/v1/N19-1406
- Cite (ACL):
- Felix Stahlberg, Christopher Bryant, and Bill Byrne. 2019. Neural Grammatical Error Correction with Finite State Transducers. In Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pages 4033–4039, Minneapolis, Minnesota. Association for Computational Linguistics.
- Cite (Informal):
- Neural Grammatical Error Correction with Finite State Transducers (Stahlberg et al., NAACL 2019)
- PDF:
- https://preview.aclanthology.org/fix-dup-bibkey/N19-1406.pdf
- Data
- JFLEG, One Billion Word Benchmark