Cognitive Simplification Operations Improve Text Simplification

Eytan Chamovitz, Omri Abend


Abstract
Text Simplification (TS) is the task of converting a text into a form that is easier to read while preserving the meaning of the original. A sub-task of TS is Cognitive Simplification (CS): converting text into a form that is readily understood by people with cognitive disabilities without rendering it childish or simplistic. This sub-task has yet to be explored with neural methods in NLP, and resources for it are scarce. In this paper, we present a method for incorporating knowledge from the cognitive accessibility domain into a TS model by introducing an inductive bias regarding which simplification operations to use. We show that with this inductive bias, a TS-trained model adapts better to CS without ever seeing CS data, and outperforms a baseline model on a traditional TS benchmark. In addition, we provide a novel test dataset for CS, and analyze how CS corpora differ from existing TS corpora in terms of how simplification operations are applied.
Anthology ID:
2022.conll-1.17
Volume:
Proceedings of the 26th Conference on Computational Natural Language Learning (CoNLL)
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates (Hybrid)
Venue:
CoNLL
Publisher:
Association for Computational Linguistics
Pages:
241–265
URL:
https://aclanthology.org/2022.conll-1.17
Cite (ACL):
Eytan Chamovitz and Omri Abend. 2022. Cognitive Simplification Operations Improve Text Simplification. In Proceedings of the 26th Conference on Computational Natural Language Learning (CoNLL), pages 241–265, Abu Dhabi, United Arab Emirates (Hybrid). Association for Computational Linguistics.
Cite (Informal):
Cognitive Simplification Operations Improve Text Simplification (Chamovitz & Abend, CoNLL 2022)
PDF:
https://preview.aclanthology.org/author-url/2022.conll-1.17.pdf
Data:
 2022.conll-1.17.data.zip