Document-Level Text Simplification: Dataset, Criteria and Baseline

Renliang Sun, Hanqi Jin, Xiaojun Wan


Abstract
Text simplification is a valuable technique. However, current research is limited to sentence simplification. In this paper, we define and investigate a new task of document-level text simplification, which aims to simplify a document consisting of multiple sentences. Based on Wikipedia dumps, we first construct a large-scale dataset named D-Wikipedia and perform analysis and human evaluation on it to show that the dataset is reliable. Then, we propose a new automatic evaluation metric called D-SARI that is more suitable for the document-level simplification task. Finally, we select several representative models as baseline models for this task and perform automatic evaluation and human evaluation. We analyze the results and point out the shortcomings of the baseline models.
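For context, D-SARI extends the widely used sentence-level SARI metric to the document level; the snippet below is only a minimal illustrative sketch of computing standard SARI with the Hugging Face evaluate library, not the paper's D-SARI, whose reference implementation is available in the authors' repository linked under Code.

    # Minimal sketch, assuming the Hugging Face `evaluate` package is installed.
    # This computes plain sentence-level SARI (Xu et al., 2016), not D-SARI.
    import evaluate

    sari = evaluate.load("sari")

    # Toy example: one source sentence, one system output, two references.
    sources = ["About 95 species are currently accepted."]
    predictions = ["About 95 species are currently known."]
    references = [["About 95 species are currently known.",
                   "About 95 species are now accepted."]]

    score = sari.compute(sources=sources,
                         predictions=predictions,
                         references=references)
    print(score)  # e.g. {'sari': <value between 0 and 100>}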
Anthology ID:
2021.emnlp-main.630
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
7997–8013
URL:
https://aclanthology.org/2021.emnlp-main.630
DOI:
10.18653/v1/2021.emnlp-main.630
Cite (ACL):
Renliang Sun, Hanqi Jin, and Xiaojun Wan. 2021. Document-Level Text Simplification: Dataset, Criteria and Baseline. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 7997–8013, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Document-Level Text Simplification: Dataset, Criteria and Baseline (Sun et al., EMNLP 2021)
PDF:
https://preview.aclanthology.org/improve-issue-templates/2021.emnlp-main.630.pdf
Video:
https://preview.aclanthology.org/improve-issue-templates/2021.emnlp-main.630.mp4
Code
rlsnlp/document-level-text-simplification
Data
Newsela
WikiLarge