Unsupervised Sentence Readability Estimation Based on Parallel Corpora for Text Simplification
Rina Miyata, Toru Urakawa, Hideaki Tamori, Tomoyuki Kajiwara
Abstract
We train a relative sentence readability estimator from a corpus without absolute sentence readability. Since sentence readability depends on the reader’s knowledge, objective and absolute readability assessments require costly annotation by experts. Therefore, few corpora have absolute sentence readability, while parallel corpora for text simplification with relative sentence readability between two sentences are available for many languages. With multilingual applications in mind, we propose a method to estimate relative sentence readability based on parallel corpora for text simplification. Experimental results on ranking a set of English sentences by readability show that our method outperforms existing unsupervised methods and is comparable to supervised methods based on absolute sentence readability.
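The abstract does not spell out the model, but learning from relative judgments maps naturally onto pairwise ranking. The sketch below is a minimal illustration under that assumption: a scorer is trained with a margin ranking loss so that the complex side of each aligned (complex, simple) pair from a simplification parallel corpus receives a higher difficulty score. The toy surface features, the example pairs, and all identifiers are hypothetical stand-ins, not the authors' method.

```python
# Minimal sketch (an assumption, not the paper's architecture): train a
# scorer f(x) so that f(complex) > f(simple) for each aligned pair drawn
# from a text-simplification parallel corpus, via a margin ranking loss.
# Toy surface features stand in for a real sentence encoder.
import torch
import torch.nn as nn

def features(sentence: str) -> torch.Tensor:
    """Toy surface features: token count, mean token length, type count."""
    tokens = sentence.split()
    return torch.tensor([
        float(len(tokens)),
        sum(len(t) for t in tokens) / max(len(tokens), 1),
        float(len(set(tokens))),
    ])

# Hypothetical aligned (complex, simple) pairs, as in a simplification corpus.
pairs = [
    ("The committee deliberated extensively before reaching a verdict.",
     "The group talked a lot before deciding."),
    ("Precipitation is anticipated throughout the metropolitan region.",
     "Rain is expected across the city."),
]

scorer = nn.Linear(3, 1)                      # higher score = harder to read
loss_fn = nn.MarginRankingLoss(margin=0.5)
optimizer = torch.optim.Adam(scorer.parameters(), lr=0.01)

for _ in range(200):
    optimizer.zero_grad()
    complex_scores = torch.stack([scorer(features(c)) for c, _ in pairs]).squeeze(-1)
    simple_scores = torch.stack([scorer(features(s)) for _, s in pairs]).squeeze(-1)
    # target = 1 pushes complex_scores above simple_scores by the margin.
    loss = loss_fn(complex_scores, simple_scores, torch.ones(len(pairs)))
    loss.backward()
    optimizer.step()

# Rank unseen sentences from hardest to easiest by the learned score.
candidates = ["She left.", "The negotiations were subsequently terminated."]
ranked = sorted(candidates, key=lambda s: scorer(features(s)).item(), reverse=True)
print(ranked)
```

A margin ranking loss, rather than a pair classifier, keeps the scorer's output on a single continuous scale, which is what allows an arbitrary set of unseen sentences to be ranked afterwards, as in the paper's English ranking evaluation.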
- Anthology ID:
- 2025.bea-1.36
- Volume:
- Proceedings of the 20th Workshop on Innovative Use of NLP for Building Educational Applications (BEA 2025)
- Month:
- July
- Year:
- 2025
- Address:
- Vienna, Austria
- Editors:
- Ekaterina Kochmar, Bashar Alhafni, Marie Bexte, Jill Burstein, Andrea Horbach, Ronja Laarmann-Quante, Anaïs Tack, Victoria Yaneva, Zheng Yuan
- Venues:
- BEA | WS
- Publisher:
- Association for Computational Linguistics
- Pages:
- 499–504
- URL:
- https://preview.aclanthology.org/acl25-workshop-ingestion/2025.bea-1.36/
- Cite (ACL):
- Rina Miyata, Toru Urakawa, Hideaki Tamori, and Tomoyuki Kajiwara. 2025. Unsupervised Sentence Readability Estimation Based on Parallel Corpora for Text Simplification. In Proceedings of the 20th Workshop on Innovative Use of NLP for Building Educational Applications (BEA 2025), pages 499–504, Vienna, Austria. Association for Computational Linguistics.
- Cite (Informal):
- Unsupervised Sentence Readability Estimation Based on Parallel Corpora for Text Simplification (Miyata et al., BEA 2025)
- PDF:
- https://preview.aclanthology.org/acl25-workshop-ingestion/2025.bea-1.36.pdf