Exploring Neural Text Simplification Models

Sergiu Nisioi, Sanja Štajner, Simone Paolo Ponzetto, Liviu P. Dinu


Abstract
We present the first attempt at using sequence-to-sequence neural networks to model text simplification (TS). Unlike previously proposed automated TS systems, our neural text simplification (NTS) systems are able to simultaneously perform lexical simplification and content reduction. An extensive human evaluation of the output has shown that NTS systems achieve almost perfect grammaticality and meaning preservation of output sentences and a higher level of simplification than the state-of-the-art automated TS systems.
Anthology ID:
P17-2014
Volume:
Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)
Month:
July
Year:
2017
Address:
Vancouver, Canada
Editors:
Regina Barzilay, Min-Yen Kan
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
85–91
URL:
https://aclanthology.org/P17-2014
DOI:
10.18653/v1/P17-2014
Bibkey:
Cite (ACL):
Sergiu Nisioi, Sanja Štajner, Simone Paolo Ponzetto, and Liviu P. Dinu. 2017. Exploring Neural Text Simplification Models. In Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 85–91, Vancouver, Canada. Association for Computational Linguistics.
Cite (Informal):
Exploring Neural Text Simplification Models (Nisioi et al., ACL 2017)
PDF:
https://preview.aclanthology.org/nschneid-patch-5/P17-2014.pdf
Video:
https://preview.aclanthology.org/nschneid-patch-5/P17-2014.mp4
Data
TurkCorpus