GRS: Combining Generation and Revision in Unsupervised Sentence Simplification

Mohammad Dehghan, Dhruv Kumar, Lukasz Golab


Abstract
We propose GRS: an unsupervised approach to sentence simplification that combines text generation and text revision. We start with an iterative framework in which an input sentence is revised using explicit edit operations, and add paraphrasing as a new edit operation. This allows us to combine the advantages of generative and revision-based approaches: paraphrasing captures complex edit operations, and the use of explicit edit operations in an iterative manner provides controllability and interpretability. We demonstrate these advantages of GRS compared to existing methods on the Newsela and ASSET datasets.
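To make the abstract's iterative revise-and-score idea concrete, below is a minimal Python sketch of such a loop: explicit edit operations (a toy deletion edit and a placeholder paraphrase step) are applied repeatedly, and a candidate is kept only if it improves a simplicity score. The operations, scorer, and function names here are illustrative assumptions, not the authors' implementation; the actual GRS code is at imohammad12/grs.

```python
# Illustrative sketch of an iterative edit-and-score simplification loop.
# All operations and the scorer are toy stand-ins, not the GRS method itself.

def remove_clause(sentence: str) -> str:
    """Toy deletion edit: drop a trailing comma-separated clause, if any."""
    parts = sentence.rstrip(".").split(",")
    return parts[0].strip() + "." if len(parts) > 1 else sentence

def paraphrase(sentence: str) -> str:
    """Placeholder for a generative paraphrasing model; in GRS this would be
    a learned paraphraser, here it is a trivial stand-in."""
    return sentence.lower()

def simplicity_score(sentence: str) -> float:
    """Toy scorer: shorter sentences are treated as simpler."""
    return -len(sentence.split())

def simplify(sentence: str, max_iters: int = 5) -> str:
    """Greedy iterative revision: apply each explicit edit operation,
    keep the best candidate only if it improves the score."""
    operations = [remove_clause, paraphrase]
    current, current_score = sentence, simplicity_score(sentence)
    for _ in range(max_iters):
        candidates = [op(current) for op in operations]
        best = max(candidates, key=simplicity_score)
        if simplicity_score(best) <= current_score:
            break  # no edit improves the sentence further; stop revising
        current, current_score = best, simplicity_score(best)
    return current

if __name__ == "__main__":
    print(simplify("The committee, which met yesterday, approved the plan."))
```

Because each step applies a named edit operation and accepts it only on score improvement, the loop is both controllable (operations can be enabled or weighted individually) and interpretable (the sequence of accepted edits can be inspected), which is the advantage the abstract attributes to revision-based approaches.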
Anthology ID:
2022.findings-acl.77
Volume:
Findings of the Association for Computational Linguistics: ACL 2022
Month:
May
Year:
2022
Address:
Dublin, Ireland
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
949–960
URL:
https://aclanthology.org/2022.findings-acl.77
DOI:
10.18653/v1/2022.findings-acl.77
Cite (ACL):
Mohammad Dehghan, Dhruv Kumar, and Lukasz Golab. 2022. GRS: Combining Generation and Revision in Unsupervised Sentence Simplification. In Findings of the Association for Computational Linguistics: ACL 2022, pages 949–960, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
GRS: Combining Generation and Revision in Unsupervised Sentence Simplification (Dehghan et al., Findings 2022)
PDF:
https://preview.aclanthology.org/ingestion-script-update/2022.findings-acl.77.pdf
Video:
 https://preview.aclanthology.org/ingestion-script-update/2022.findings-acl.77.mp4
Code
 imohammad12/grs
Data
ASSET, CoLA, Newsela