CoCo: A Tool for Automatically Assessing Conceptual Complexity of Texts

Sanja Stajner, Sergiu Nisioi, Ioana Hulpuș


Abstract
Traditional text complexity assessment usually takes into account only syntactic and lexical complexity. The task of automatically assessing conceptual text complexity, which is important for maintaining reader interest and for adapting texts for struggling readers, has only recently been proposed. In this paper, we present CoCo, a tool for the automatic assessment of conceptual text complexity based on the current state-of-the-art unsupervised approach. We make the code and API freely available for research purposes, and describe the code and the possibilities for its personalization and adaptation in detail. We compare the current implementation with the state of the art, discussing the influence of the choice of entity linker on the performance of the tool. Finally, we present results obtained on two widely used text simplification corpora, discussing the full potential of the tool.
Anthology ID:
2020.lrec-1.887
Volume:
Proceedings of the Twelfth Language Resources and Evaluation Conference
Month:
May
Year:
2020
Address:
Marseille, France
Editors:
Nicoletta Calzolari, Frédéric Béchet, Philippe Blache, Khalid Choukri, Christopher Cieri, Thierry Declerck, Sara Goggi, Hitoshi Isahara, Bente Maegaard, Joseph Mariani, Hélène Mazo, Asuncion Moreno, Jan Odijk, Stelios Piperidis
Venue:
LREC
Publisher:
European Language Resources Association
Pages:
7179–7186
Language:
English
URL:
https://aclanthology.org/2020.lrec-1.887
Cite (ACL):
Sanja Stajner, Sergiu Nisioi, and Ioana Hulpuș. 2020. CoCo: A Tool for Automatically Assessing Conceptual Complexity of Texts. In Proceedings of the Twelfth Language Resources and Evaluation Conference, pages 7179–7186, Marseille, France. European Language Resources Association.
Cite (Informal):
CoCo: A Tool for Automatically Assessing Conceptual Complexity of Texts (Stajner et al., LREC 2020)
PDF:
https://preview.aclanthology.org/ingest-acl-2023-videos/2020.lrec-1.887.pdf