Argument Similarity Assessment in German for Intelligent Tutoring: Crowdsourced Dataset and First Experiments

Xiaoyu Bai, Manfred Stede


Abstract
NLP technologies such as text similarity assessment, question answering and text classification are increasingly being used to develop intelligent educational applications. The long-term goal of our work is an intelligent tutoring system for German secondary schools, which will support students in a school exercise that requires them to identify arguments in an argumentative source text. The present paper presents our work on a central subtask, viz. the automatic assessment of the similarity between a pair of argumentative text snippets in German. In the designated use case, students write out key arguments from a given source text; the tutoring system then evaluates them against a target reference, assessing the similarity level between the student's work and the reference. Since authentic German student data are scarce, we collect a dataset for our similarity assessment task through crowdsourcing; we label the collected text pairs with similarity scores on a 5-point scale and run first experiments on the task. A BERT-based model shows promising results; we also discuss some challenges that we observed.
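
As a rough illustration of the task setup described in the abstract, the sketch below scores a German text pair (student formulation vs. reference argument) with a BERT cross-encoder whose single-output head would be fine-tuned as a regressor against the 5-point similarity labels. This is a minimal sketch under our own assumptions: the checkpoint name (bert-base-german-cased) and the example sentences are illustrative, not the configuration reported in the paper.

# Minimal sketch of BERT-based pair similarity scoring (assumed setup,
# not the paper's exact model). Requires: pip install torch transformers
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# A common German BERT checkpoint; the paper does not commit to this one here.
MODEL_NAME = "bert-base-german-cased"

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
# num_labels=1 gives a single regression output, which can be fine-tuned
# (e.g. with MSE loss) against gold similarity scores on the 5-point scale.
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=1)
model.eval()

# Hypothetical student formulation and reference argument.
student_answer = "Autos verursachen hohe CO2-Emissionen."
reference_argument = "Der Autoverkehr trägt stark zum CO2-Ausstoß bei."

# Encode the two snippets jointly as a sentence pair for the cross-encoder.
inputs = tokenizer(student_answer, reference_argument,
                   truncation=True, return_tensors="pt")
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()

# Before fine-tuning, the regression head is randomly initialized, so this
# value is meaningless; after training it would approximate the 1-5 label.
print(f"Predicted similarity (illustrative only): {score:.2f}")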
Anthology ID: 2022.lrec-1.234
Volume: Proceedings of the Thirteenth Language Resources and Evaluation Conference
Month: June
Year: 2022
Address: Marseille, France
Venue: LREC
Publisher: European Language Resources Association
Pages: 2177–2187
URL: https://aclanthology.org/2022.lrec-1.234
Cite (ACL): Xiaoyu Bai and Manfred Stede. 2022. Argument Similarity Assessment in German for Intelligent Tutoring: Crowdsourced Dataset and First Experiments. In Proceedings of the Thirteenth Language Resources and Evaluation Conference, pages 2177–2187, Marseille, France. European Language Resources Association.
Cite (Informal): Argument Similarity Assessment in German for Intelligent Tutoring: Crowdsourced Dataset and First Experiments (Bai & Stede, LREC 2022)
PDF: https://preview.aclanthology.org/ingestion-script-update/2022.lrec-1.234.pdf