Abstract
We propose GRS: an unsupervised approach to sentence simplification that combines text generation and text revision. We start with an iterative framework in which an input sentence is revised using explicit edit operations, and add paraphrasing as a new edit operation. This allows us to combine the advantages of generative and revision-based approaches: paraphrasing captures complex edit operations, and the use of explicit edit operations in an iterative manner provides controllability and interpretability. We demonstrate these advantages of GRS compared to existing methods on the Newsela and ASSET datasets.

- Anthology ID: 2022.findings-acl.77
- Volume: Findings of the Association for Computational Linguistics: ACL 2022
- Month: May
- Year: 2022
- Address: Dublin, Ireland
- Editors: Smaranda Muresan, Preslav Nakov, Aline Villavicencio
- Venue: Findings
- Publisher: Association for Computational Linguistics
- Pages: 949–960
- URL: https://aclanthology.org/2022.findings-acl.77
- DOI: 10.18653/v1/2022.findings-acl.77
- Cite (ACL): Mohammad Dehghan, Dhruv Kumar, and Lukasz Golab. 2022. GRS: Combining Generation and Revision in Unsupervised Sentence Simplification. In Findings of the Association for Computational Linguistics: ACL 2022, pages 949–960, Dublin, Ireland. Association for Computational Linguistics.
- Cite (Informal): GRS: Combining Generation and Revision in Unsupervised Sentence Simplification (Dehghan et al., Findings 2022)
- PDF: https://preview.aclanthology.org/naacl24-info/2022.findings-acl.77.pdf
- Code: imohammad12/grs
- Data: ASSET, CoLA, Newsela
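
The abstract describes an iterative framework in which explicit edit operations revise the input sentence, with paraphrasing added as one such operation. The sketch below illustrates only that general loop structure: candidate edits are proposed, scored, and the best one is kept until no edit improves the score. The edit operations, word lists, and scoring weights are hypothetical placeholders for illustration, not the trained paraphrasing model or scoring components used in the paper.

```python
# Minimal sketch of an iterative revision loop: explicit edit operations (here, word
# deletion and a toy paraphrase substitution) propose candidates, a scoring function
# ranks them, and the best candidate replaces the current sentence until no edit
# improves the score. All rules and weights below are illustrative placeholders,
# not the GRS components from the paper.
from typing import Callable, List

STOPWORDS = {"the", "a", "an", "to", "of", "in", "on", "that", "will"}
SIMPLE_FORMS = {"utilize": "use", "approximately": "about", "demonstrate": "show"}

def delete_word(sentence: str) -> List[str]:
    """Deletion edit: propose every sentence obtained by removing one word."""
    words = sentence.split()
    return [" ".join(words[:i] + words[i + 1:]) for i in range(len(words)) if len(words) > 1]

def paraphrase(sentence: str) -> List[str]:
    """Toy paraphrase edit: swap complex words for simpler synonyms.
    A real system would call a paraphrasing model here."""
    words = [SIMPLE_FORMS.get(w.lower(), w) for w in sentence.split()]
    return [" ".join(words)]

def make_scorer(source: str) -> Callable[[str], float]:
    """Toy scorer: reward keeping the source's content words (a listed synonym counts),
    penalize long words and long sentences. Real systems also score fluency."""
    content = [w.lower() for w in source.split() if w.lower() not in STOPWORDS]
    def score(candidate: str) -> float:
        cand = {w.lower() for w in candidate.split()}
        kept = sum(1 for w in content if w in cand or SIMPLE_FORMS.get(w) in cand)
        complex_words = sum(1 for w in candidate.split() if len(w) > 6)
        return 0.5 * kept - 0.2 * complex_words - 0.05 * len(candidate.split())
    return score

def iterative_simplify(sentence: str, edits: List[Callable[[str], List[str]]]) -> str:
    score = make_scorer(sentence)
    current = sentence
    while True:
        candidates = [c for edit in edits for c in edit(current) if c != current]
        best = max(candidates, key=score, default=current)
        if not candidates or score(best) <= score(current):
            return current  # no edit improves the score; stop revising
        current = best

if __name__ == "__main__":
    text = "The committee will utilize approximately ten days to demonstrate the results"
    print(iterative_simplify(text, [delete_word, paraphrase]))
```

A greedy loop like this keeps every accepted edit explicit, which is the controllability and interpretability benefit the abstract points to; in practice, candidates would be scored with learned fluency and meaning-preservation models rather than the rough word-overlap heuristic used in this sketch.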