ARTS: Assessing Readability & Text Simplicity

Björn Engelmann, Christin Katharina Kreutz, Fabian Haak, Philipp Schaer


Abstract
Automatic text simplification aims to reduce a text’s complexity. Its evaluation should quantify how easy it is to understand a text. Datasets with simplicity labels on the text level are a prerequisite for developing such evaluation approaches. However, current publicly available datasets do not align with this, as they mainly treat text simplification as a relational concept (“How much simpler has this text gotten compared to the original version?”) or assign discrete readability levels. This work alleviates the problem of Assessing Readability & Text Simplicity. We present ARTS, a method for language-independent construction of datasets for simplicity assessment. We propose using pairwise comparisons of texts in conjunction with an Elo algorithm to produce a simplicity ranking and simplicity scores. Additionally, we provide a high-quality human-labeled simplicity dataset and three GPT-labeled ones. Our results show a high correlation between human and LLM-based labels, allowing for an effective and cost-efficient way to construct large synthetic datasets.
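The core idea of deriving scores from pairwise comparisons via an Elo algorithm can be illustrated with a minimal sketch. This is not the paper's implementation; function names, the K-factor, and the initial rating are illustrative assumptions based on the standard Elo update rule, where each "match" is a pairwise judgment of which text is simpler.

```python
def expected_score(r_a, r_b):
    """Standard Elo expectation: probability that A beats B
    (here: that text A is judged simpler than text B)."""
    return 1.0 / (1.0 + 10 ** ((r_b - r_a) / 400))

def elo_simplicity_scores(texts, comparisons, k=32, initial=1000.0):
    """Rank texts by simplicity from pairwise judgments.

    texts:       iterable of text identifiers
    comparisons: list of (winner, loser) pairs, where the winner
                 was judged simpler in that comparison
    Returns a (ranking, ratings) pair: ranking is simplest-first.
    """
    ratings = {t: initial for t in texts}
    for winner, loser in comparisons:
        e_win = expected_score(ratings[winner], ratings[loser])
        delta = k * (1.0 - e_win)  # surprise-weighted update
        ratings[winner] += delta
        ratings[loser] -= delta
    ranking = sorted(ratings, key=ratings.get, reverse=True)
    return ranking, ratings

# Toy usage: three texts, judgments consistent with easy > medium > hard.
texts = ["easy", "medium", "hard"]
judgments = [("easy", "medium"), ("easy", "hard"), ("medium", "hard")]
ranking, scores = elo_simplicity_scores(texts, judgments)
```

After these three judgments, the ratings order the texts simplest-first as `["easy", "medium", "hard"]`, giving both a ranking and continuous simplicity scores.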
Anthology ID:
2024.findings-emnlp.877
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2024
Month:
November
Year:
2024
Address:
Miami, Florida, USA
Editors:
Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
14925–14942
URL:
https://preview.aclanthology.org/add-emnlp-2024-awards/2024.findings-emnlp.877/
DOI:
10.18653/v1/2024.findings-emnlp.877
Cite (ACL):
Björn Engelmann, Christin Katharina Kreutz, Fabian Haak, and Philipp Schaer. 2024. ARTS: Assessing Readability & Text Simplicity. In Findings of the Association for Computational Linguistics: EMNLP 2024, pages 14925–14942, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal):
ARTS: Assessing Readability & Text Simplicity (Engelmann et al., Findings 2024)
PDF:
https://preview.aclanthology.org/add-emnlp-2024-awards/2024.findings-emnlp.877.pdf
Software:
 2024.findings-emnlp.877.software.zip
Data:
 2024.findings-emnlp.877.data.zip