Document-level Text Simplification with Coherence Evaluation

Laura Vásquez-Rodríguez, Matthew Shardlow, Piotr Przybyła, Sophia Ananiadou


Abstract
We present a coherence-aware evaluation of document-level Text Simplification (TS), an aspect that has not been considered in TS evaluation so far. We extend current sentence-based TS models to a multi-sentence setting and apply a state-of-the-art neural coherence model to assess simplification quality. Specifically, we enhance English neural sentence simplification models for document-level simplification using 136,113 paragraph-level samples from the general and medical domains, enabling them to generate multiple sentences, and we evaluate the outputs with document-level simplification, readability and coherence metrics. Our contributions include: the introduction of coherence assessment into simplification evaluation, with an automatic evaluation of 34,052 simplifications; a fine-tuned state-of-the-art model for document-level simplification; a coherence-based analysis of our results; and a human evaluation of 300 samples that demonstrates the challenges encountered when moving towards document-level simplification.
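
For illustration, the sketch below computes document-level readability scores with the textstat package. The package choice, the metrics shown and the readability_report helper are assumptions made for this example only; they are not the authors' released evaluation code, which additionally relies on simplification-specific metrics and a neural coherence model.

    # Minimal sketch of document-level readability scoring (illustrative only;
    # not the paper's evaluation pipeline). Assumes the `textstat` package.
    import textstat


    def readability_report(document: str) -> dict:
        """Compute standard readability scores over a whole document,
        rather than sentence by sentence."""
        return {
            # Higher values = easier to read (roughly 60-70 for plain English).
            "flesch_reading_ease": textstat.flesch_reading_ease(document),
            # Approximate US school grade level needed to understand the text.
            "flesch_kincaid_grade": textstat.flesch_kincaid_grade(document),
        }


    if __name__ == "__main__":
        original = (
            "The patient exhibited symptoms consistent with a myocardial "
            "infarction and was administered thrombolytic therapy."
        )
        simplified = (
            "The patient had signs of a heart attack. "
            "Doctors gave them medicine to break up the blood clot."
        )
        print("original:  ", readability_report(original))
        print("simplified:", readability_report(simplified))

In this toy example, the simplified paragraph should score a noticeably lower grade level and higher reading ease than the original, which is the kind of document-level signal the paper combines with coherence assessment.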
Anthology ID:
2023.tsar-1.9
Volume:
Proceedings of the Second Workshop on Text Simplification, Accessibility and Readability
Month:
September
Year:
2023
Address:
Varna, Bulgaria
Editors:
Sanja Štajner, Horacio Saggion, Matthew Shardlow, Fernando Alva-Manchego
Venues:
TSAR | WS
Publisher:
INCOMA Ltd., Shoumen, Bulgaria
Pages:
85–101
URL:
https://aclanthology.org/2023.tsar-1.9
Cite (ACL):
Laura Vásquez-Rodríguez, Matthew Shardlow, Piotr Przybyła, and Sophia Ananiadou. 2023. Document-level Text Simplification with Coherence Evaluation. In Proceedings of the Second Workshop on Text Simplification, Accessibility and Readability, pages 85–101, Varna, Bulgaria. INCOMA Ltd., Shoumen, Bulgaria.
Cite (Informal):
Document-level Text Simplification with Coherence Evaluation (Vásquez-Rodríguez et al., TSAR-WS 2023)
PDF:
https://preview.aclanthology.org/naacl24-info/2023.tsar-1.9.pdf