Rubric Reliability and Annotation of Content and Argument in Source-Based Argument Essays
Yanjun Gao, Alex Driban, Brennan Xavier McManus, Elena Musi, Patricia Davies, Smaranda Muresan, Rebecca J. Passonneau
Abstract
We present a unique dataset of student source-based argument essays to facilitate research on the relations between content, argumentation skills, and assessment. Two classroom writing assignments were given to college students in a STEM major, accompanied by a carefully designed rubric. The paper presents a reliability study of the rubric, showing it to be highly reliable, and initial annotations of the content and argumentation of the essays.
- Anthology ID: W19-4452
- Volume: Proceedings of the Fourteenth Workshop on Innovative Use of NLP for Building Educational Applications
- Month: August
- Year: 2019
- Address: Florence, Italy
- Editors: Helen Yannakoudakis, Ekaterina Kochmar, Claudia Leacock, Nitin Madnani, Ildikó Pilán, Torsten Zesch
- Venue: BEA
- SIG: SIGEDU
- Publisher: Association for Computational Linguistics
- Pages: 507–518
- URL: https://aclanthology.org/W19-4452
- DOI: 10.18653/v1/W19-4452
- Cite (ACL): Yanjun Gao, Alex Driban, Brennan Xavier McManus, Elena Musi, Patricia Davies, Smaranda Muresan, and Rebecca J. Passonneau. 2019. Rubric Reliability and Annotation of Content and Argument in Source-Based Argument Essays. In Proceedings of the Fourteenth Workshop on Innovative Use of NLP for Building Educational Applications, pages 507–518, Florence, Italy. Association for Computational Linguistics.
- Cite (Informal): Rubric Reliability and Annotation of Content and Argument in Source-Based Argument Essays (Gao et al., BEA 2019)
- PDF: https://preview.aclanthology.org/emnlp22-frontmatter/W19-4452.pdf
- Code: psunlpgroup/SEAView