A Language Model based Evaluator for Sentence Compression

Yang Zhao, Zhiyuan Luo, Akiko Aizawa


Abstract
We herein present a language-model-based evaluator for deletion-based sentence compression and view this task as a series of deletion-and-evaluation operations using the evaluator. More specifically, the evaluator is a syntactic neural language model that is first built by learning the syntactic and structural collocation among words. Subsequently, a series of trial-and-error deletion operations are conducted on the source sentences via a reinforcement learning framework to obtain the best target compression. An empirical study shows that the proposed model can effectively generate more readable compressions, comparable or superior to those of several strong baselines. Furthermore, we introduce a 200-sentence test set for a large-scale dataset, setting a new baseline for future research.
Anthology ID:
P18-2028
Volume:
Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)
Month:
July
Year:
2018
Address:
Melbourne, Australia
Editors:
Iryna Gurevych, Yusuke Miyao
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
170–175
URL:
https://preview.aclanthology.org/build-pipeline-with-new-library/P18-2028/
DOI:
10.18653/v1/P18-2028
Cite (ACL):
Yang Zhao, Zhiyuan Luo, and Akiko Aizawa. 2018. A Language Model based Evaluator for Sentence Compression. In Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 170–175, Melbourne, Australia. Association for Computational Linguistics.
Cite (Informal):
A Language Model based Evaluator for Sentence Compression (Zhao et al., ACL 2018)
PDF:
https://preview.aclanthology.org/build-pipeline-with-new-library/P18-2028.pdf
Note:
 P18-2028.Notes.pdf
Poster:
 P18-2028.Poster.pdf
Data
Google Sentence Compression