Abstract
Annotation errors that stem from various sources are usually unavoidable when performing large-scale annotation of linguistic data. In this paper, we evaluate the feasibility of using the Transformer model to detect various types of annotator errors in morphological data sets that contain inflected word forms. We evaluate our error detection model on four languages by introducing three different types of artificial errors into the data: (1) typographic errors, where single characters in the data are inserted, replaced, or deleted; (2) linguistic confusion errors, where two inflected forms are systematically swapped; and (3) self-adversarial errors, where the Transformer model itself is used to generate plausible-looking but erroneous forms by retrieving high-scoring predictions from the search beam. Results show that the Transformer model can detect errors with perfect or near-perfect recall in all three scenarios, even when significant amounts of the annotated data (5%–30%) are corrupted, on all languages tested. Precision varies across the languages and types of errors, but is high enough that the model can be very effectively used to flag suspicious entries in large data sets for further scrutiny by human annotators.

- Anthology ID: 2022.acl-short.19
- Volume: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)
- Month: May
- Year: 2022
- Address: Dublin, Ireland
- Venue: ACL
- Publisher: Association for Computational Linguistics
- Pages: 166–174
- URL: https://aclanthology.org/2022.acl-short.19
- DOI: 10.18653/v1/2022.acl-short.19
- Cite (ACL): Ling Liu and Mans Hulden. 2022. Detecting Annotation Errors in Morphological Data with the Transformer. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 166–174, Dublin, Ireland. Association for Computational Linguistics.
- Cite (Informal): Detecting Annotation Errors in Morphological Data with the Transformer (Liu & Hulden, ACL 2022)
- PDF: https://preview.aclanthology.org/nodalida-main-page/2022.acl-short.19.pdf
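The first two corruption types described in the abstract are simple to reproduce. The sketch below is a hypothetical illustration (not the authors' code): `typo_corrupt` applies a single character-level insert, replace, or delete to an inflected form, and `swap_corrupt` models a linguistic confusion error by swapping the forms in two inflection slots of a paradigm. The slot labels are made up for the example; the self-adversarial errors, which require a trained Transformer's beam search, are not shown.

```python
import random

def typo_corrupt(form, rng=random):
    """Apply one character-level edit to an inflected form:
    insert, replace, or delete a single character."""
    alphabet = sorted(set(form)) or ["a"]
    ops = ["insert", "replace"] if len(form) <= 1 else ["insert", "replace", "delete"]
    op = rng.choice(ops)
    i = rng.randrange(len(form))
    if op == "insert":
        return form[:i] + rng.choice(alphabet) + form[i:]
    if op == "replace":
        return form[:i] + rng.choice(alphabet) + form[i + 1:]
    return form[:i] + form[i + 1:]  # delete

def swap_corrupt(paradigm, slot_a, slot_b):
    """Linguistic confusion error: systematically swap the forms
    filling two inflection slots of a paradigm."""
    corrupted = dict(paradigm)
    corrupted[slot_a], corrupted[slot_b] = corrupted[slot_b], corrupted[slot_a]
    return corrupted
```

For example, `swap_corrupt({"PST": "walked", "PRS;3;SG": "walks"}, "PST", "PRS;3;SG")` yields a paradigm in which the past-tense slot holds "walks" — a plausible-looking entry that nevertheless mislabels the form, which is exactly the kind of error the detection model is asked to flag.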