Towards Reference-free Text Simplification Evaluation with a BERT Siamese Network Architecture

Xinran Zhao, Esin Durmus, Dit-Yan Yeung


Abstract
Text simplification (TS) aims to modify sentences to make both their content and structure easier to understand. Traditional n-gram-matching-based TS evaluation metrics rely heavily on exact token matches and on human-annotated simplified sentences as references. In this paper, we present BETS, a novel neural-network-based reference-free TS metric that leverages pre-trained contextualized language representation models and large-scale paraphrasing datasets to evaluate simplicity and meaning preservation. We show that our metric, without requiring any costly human simplification references, correlates better than existing metrics with human judgments of both overall simplification quality (+7.7%) and its key aspects, i.e., comparative simplicity (+11.2%) and meaning preservation (+9.2%).
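The sketch below illustrates the Siamese scoring idea the abstract describes: a single shared encoder embeds the source and the simplified sentence, and their similarity serves as a meaning-preservation score. This is a minimal illustration only, not the authors' released BETS model; the choice of bert-base-uncased, mean pooling, and cosine similarity are assumptions, and the paper's paraphrase-trained components and simplicity scorer are not reproduced here.

    # Illustrative sketch of a Siamese-style BERT similarity scorer.
    # NOT the authors' BETS model; encoder choice and pooling are assumptions.
    import torch
    from transformers import AutoTokenizer, AutoModel

    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    encoder = AutoModel.from_pretrained("bert-base-uncased")

    def embed(sentence: str) -> torch.Tensor:
        # Tokenize, encode, and mean-pool final hidden states over non-pad tokens.
        inputs = tokenizer(sentence, return_tensors="pt", truncation=True)
        with torch.no_grad():
            hidden = encoder(**inputs).last_hidden_state  # (1, seq_len, dim)
        mask = inputs["attention_mask"].unsqueeze(-1)     # (1, seq_len, 1)
        return (hidden * mask).sum(1) / mask.sum(1)       # (1, dim)

    def meaning_preservation(source: str, simplified: str) -> float:
        # Both sentences pass through the same encoder (the "Siamese" part);
        # the score is the cosine similarity of their pooled embeddings.
        return torch.nn.functional.cosine_similarity(
            embed(source), embed(simplified)
        ).item()

    print(meaning_preservation(
        "The committee deliberated at length before reaching a verdict.",
        "The committee talked for a long time before deciding.",
    ))

In this setup, a higher score suggests the simplification kept more of the source meaning; a trained metric like BETS would additionally learn a simplicity component rather than relying on raw embedding similarity.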
Anthology ID:
2023.findings-acl.838
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
13250–13264
URL:
https://aclanthology.org/2023.findings-acl.838
DOI:
10.18653/v1/2023.findings-acl.838
Cite (ACL):
Xinran Zhao, Esin Durmus, and Dit-Yan Yeung. 2023. Towards Reference-free Text Simplification Evaluation with a BERT Siamese Network Architecture. In Findings of the Association for Computational Linguistics: ACL 2023, pages 13250–13264, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Towards Reference-free Text Simplification Evaluation with a BERT Siamese Network Architecture (Zhao et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-acl.838.pdf
Video:
https://aclanthology.org/2023.findings-acl.838.mp4