REFeREE: A REference-FREE Model-Based Metric for Text Simplification

Yichen Huang, Ekaterina Kochmar


Abstract
Text simplification lacks a universal standard of quality, and annotated reference simplifications are scarce and costly. We propose to alleviate such limitations by introducing REFeREE, a reference-free model-based metric with a 3-stage curriculum. REFeREE leverages an arbitrarily scalable pretraining stage and can be applied to any quality standard as long as a small number of human annotations are available. Our experiments show that our metric outperforms existing reference-based metrics in predicting overall ratings and reaches competitive and consistent performance in predicting specific ratings while requiring no reference simplifications at inference time.
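To illustrate the inference-time interface the abstract describes, here is a minimal, hypothetical sketch (not the authors' code): a reference-free metric scores a (source, simplification) pair directly, with no reference simplifications. The surface features and linear regressor below are illustrative stand-ins for REFeREE's pretrained model; the weights stand in for parameters that would be fit on a small set of human annotations.

```python
# Hypothetical sketch of a reference-free metric: score(source, simplification)
# returns a scalar quality estimate with no reference text required.
# Features and weights are illustrative, not the paper's actual model.

def features(source: str, simplification: str) -> list:
    """Toy surface features; a real metric would use a pretrained encoder."""
    src_len = len(source.split())
    simp_len = len(simplification.split())
    compression = simp_len / max(src_len, 1)          # length ratio
    overlap = len(set(source.lower().split()) & set(simplification.lower().split()))
    overlap_ratio = overlap / max(simp_len, 1)        # lexical preservation
    return [compression, overlap_ratio]

def score(weights: list, source: str, simplification: str) -> float:
    """Linear regressor over features; in the paper's setup, parameters
    would be fine-tuned on a small number of human ratings."""
    return sum(w * f for w, f in zip(weights, features(source, simplification)))

# Example: only the source and its simplification are needed at inference time.
w = [0.5, 0.5]  # illustrative weights
s = score(w, "The committee deliberated at length.",
             "The committee talked for a long time.")
```

The key property this sketch shares with the metric described in the abstract is the signature: quality is predicted from the source-simplification pair alone, so no annotated references are needed once the model is trained.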
Anthology ID:
2024.lrec-main.1200
Volume:
Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024)
Month:
May
Year:
2024
Address:
Torino, Italia
Editors:
Nicoletta Calzolari, Min-Yen Kan, Veronique Hoste, Alessandro Lenci, Sakriani Sakti, Nianwen Xue
Venues:
LREC | COLING
Publisher:
ELRA and ICCL
Pages:
13740–13753
URL:
https://aclanthology.org/2024.lrec-main.1200
Cite (ACL):
Yichen Huang and Ekaterina Kochmar. 2024. REFeREE: A REference-FREE Model-Based Metric for Text Simplification. In Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024), pages 13740–13753, Torino, Italia. ELRA and ICCL.
Cite (Informal):
REFeREE: A REference-FREE Model-Based Metric for Text Simplification (Huang & Kochmar, LREC-COLING 2024)
PDF:
https://preview.aclanthology.org/nschneid-patch-2/2024.lrec-main.1200.pdf