Iterative Edit-Based Unsupervised Sentence Simplification

Dhruv Kumar, Lili Mou, Lukasz Golab, Olga Vechtomova


Abstract
We present a novel iterative, edit-based approach to unsupervised sentence simplification. Our model is guided by a scoring function involving fluency, simplicity, and meaning preservation. Then, we iteratively perform word and phrase-level edits on the complex sentence. Compared with previous approaches, our model does not require a parallel training set, but is more controllable and interpretable. Experiments on Newsela and WikiLarge datasets show that our approach is nearly as effective as state-of-the-art supervised approaches.
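The approach the abstract describes, iterative word- and phrase-level edits accepted only when they improve a combined fluency/simplicity/meaning-preservation score, can be viewed as greedy hill-climbing. The sketch below is a minimal illustration under assumed toy components: constant fluency, length-based simplicity, word-overlap meaning preservation, and single-word deletion as the only edit. All names here (fluency, simplicity, meaning_preservation, candidate_edits, simplify) are hypothetical placeholders and do not mirror the released implementation in ddhruvkr/Edit-Unsup-TS.

    def fluency(sentence):
        # Toy stand-in; a real scorer would use, e.g., a language-model probability.
        return 1.0

    def simplicity(sentence):
        # Toy stand-in: reward shorter sentences.
        return 1.0 / (1.0 + len(sentence.split())) ** 2

    def meaning_preservation(sentence, source):
        # Toy stand-in: fraction of source word types retained.
        src, cur = set(source.lower().split()), set(sentence.lower().split())
        return len(src & cur) / max(len(src), 1)

    def score(sentence, source):
        # Combined objective as a product, so a near-zero component vetoes a candidate.
        return fluency(sentence) * simplicity(sentence) * meaning_preservation(sentence, source)

    def candidate_edits(sentence):
        # Illustrative edit set: delete one word at a time. The paper also uses
        # phrase-level operations (removal, extraction, reordering, substitution).
        words = sentence.split()
        if len(words) <= 1:
            return []
        return [" ".join(words[:i] + words[i + 1:]) for i in range(len(words))]

    def simplify(source, max_iters=10):
        current = source
        for _ in range(max_iters):
            candidates = candidate_edits(current)
            if not candidates:
                break
            best = max(candidates, key=lambda c: score(c, source))
            if score(best, source) <= score(current, source):
                break  # no edit improves the combined score, so stop
            current = best
        return current

Because an edit is accepted only when it strictly improves the score, the loop always terminates. With these crude stand-in scorers the loop over-favors deletion; the actual model's stronger components are what balance brevity against fluency and meaning preservation.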
Anthology ID:
2020.acl-main.707
Volume:
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2020
Address:
Online
Editors:
Dan Jurafsky, Joyce Chai, Natalie Schluter, Joel Tetreault
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
7918–7928
URL:
https://aclanthology.org/2020.acl-main.707
DOI:
10.18653/v1/2020.acl-main.707
Cite (ACL):
Dhruv Kumar, Lili Mou, Lukasz Golab, and Olga Vechtomova. 2020. Iterative Edit-Based Unsupervised Sentence Simplification. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 7918–7928, Online. Association for Computational Linguistics.
Cite (Informal):
Iterative Edit-Based Unsupervised Sentence Simplification (Kumar et al., ACL 2020)
PDF:
https://preview.aclanthology.org/nschneid-patch-2/2020.acl-main.707.pdf
Video:
http://slideslive.com/38929280
Code:
ddhruvkr/Edit-Unsup-TS
Data:
Newsela, TurkCorpus, WikiLarge