George Narroway


2012

HOO 2012: A Report on the Preposition and Determiner Error Correction Shared Task
Robert Dale | Ilya Anisimoff | George Narroway
Proceedings of the Seventh Workshop on Building Educational Applications Using NLP

A Framework for Evaluating Text Correction
Robert Dale | George Narroway
Proceedings of the Eighth International Conference on Language Resources and Evaluation (LREC'12)

Computer-based aids for writing assistance have been around since at least the early 1980s, focussing primarily on aspects such as spelling, grammar and style. The potential audience for such tools is very large indeed, and this is a clear case where we might expect language processing applications to have a significant real-world impact. However, existing comparative evaluations of applications in this space are often no more than impressionistic and anecdotal reviews of commercial offerings as found in software magazines, making it hard to determine which approaches are superior. More rigorous evaluation in the scholarly literature has been held back in particular by the absence of shared datasets of texts marked up with errors, and by the lack of an agreed evaluation framework. Significant collections of publicly available data are now appearing; this paper describes a complementary evaluation framework, which has been piloted in the Helping Our Own shared task. The approach, which uses stand-off annotations for representing edits to text, can be used in a wide variety of text-correction tasks, and easily accommodates different error tagsets.
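
As a rough illustration of the stand-off idea mentioned in the abstract, the sketch below keeps the source text unchanged and records each proposed edit separately as character offsets, an error-type tag, and candidate corrections. The field names, the "RT" tag, and the apply_edits helper are illustrative assumptions, not the paper's actual annotation scheme.

from dataclasses import dataclass, field


@dataclass
class Edit:
    """One stand-off edit: offsets into the untouched source text."""
    start: int                 # offset of the first affected character
    end: int                   # offset one past the last affected character
    error_type: str            # tag drawn from whatever error tagset is in use
    corrections: list = field(default_factory=list)  # proposed replacements


def apply_edits(text: str, edits: list) -> str:
    """Apply the first correction of each edit, right to left so that
    earlier offsets remain valid after each replacement."""
    for e in sorted(edits, key=lambda e: e.start, reverse=True):
        replacement = e.corrections[0] if e.corrections else ""
        text = text[:e.start] + replacement + text[e.end:]
    return text


if __name__ == "__main__":
    source = "She is interested on linguistics."
    # Hypothetical preposition-replacement edit over characters 18-19 ("on").
    edits = [Edit(start=18, end=20, error_type="RT", corrections=["in"])]
    print(apply_edits(source, edits))  # She is interested in linguistics.

Because the annotations live outside the text, the same source can carry edits from several annotators or systems, and swapping in a different error tagset only changes the value of the error_type field.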