Cross-Sentence Transformations in Text Simplification
Abstract
Current approaches to Text Simplification focus on simplifying sentences individually. However, certain simplification transformations span beyond single sentences (e.g. joining and re-ordering sentences). In this paper, we motivate the need for modelling the simplification task at the document level, and assess the performance of sequence-to-sequence neural models in this setup. We analyse parallel original-simplified documents created by professional editors and show that there are frequent rewriting transformations that are not restricted to sentence boundaries. We also propose strategies to automatically evaluate the performance of a simplification model on these cross-sentence transformations. Our experiments show the inability of standard sequence-to-sequence neural models to learn these transformations, and suggest directions towards document-level simplification.
- Anthology ID: W19-3656
- Volume: Proceedings of the 2019 Workshop on Widening NLP
- Month: August
- Year: 2019
- Address: Florence, Italy
- Venue: WiNLP
- Publisher: Association for Computational Linguistics
- Pages: 181–184
- URL: https://aclanthology.org/W19-3656
- Cite (ACL): Fernando Alva-Manchego, Carolina Scarton, and Lucia Specia. 2019. Cross-Sentence Transformations in Text Simplification. In Proceedings of the 2019 Workshop on Widening NLP, pages 181–184, Florence, Italy. Association for Computational Linguistics.
- Cite (Informal): Cross-Sentence Transformations in Text Simplification (Alva-Manchego et al., WiNLP 2019)