HULAT-UC3M at TSAR 2025 Shared Task A Prompt-Based Approach using Lightweight Language Models for Readability-Controlled Text Simplification

Jesus M. Sanchez-Gomez, Lourdes Moreno, Paloma Martínez, Marco Antonio Sanchez-Escudero


Abstract
This paper describes the participation of the HULAT-UC3M team in the TSAR 2025 Shared Task on Readability-Controlled Text Simplification. Our approach uses open, lightweight Large Language Models (LLMs) of different sizes, together with two prompt-engineering strategies. The proposed system was tested on the provided trial data and evaluated using the official metrics: CEFR Compliance, Meaning Preservation, and Similarity to References. The LLaMA 3 8B model with reinforced prompts was selected as our final submission, ranking fourteenth according to the overall metric. Finally, we discuss the main challenges we identified while developing our approach for this task.
Anthology ID:
2025.tsar-1.15
Volume:
Proceedings of the Fourth Workshop on Text Simplification, Accessibility and Readability (TSAR 2025)
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Matthew Shardlow, Fernando Alva-Manchego, Kai North, Regina Stodden, Horacio Saggion, Nouran Khallaf, Akio Hayakawa
Venues:
TSAR | WS
Publisher:
Association for Computational Linguistics
Pages:
183–192
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.tsar-1.15/
Cite (ACL):
Jesus M. Sanchez-Gomez, Lourdes Moreno, Paloma Martínez, and Marco Antonio Sanchez-Escudero. 2025. HULAT-UC3M at TSAR 2025 Shared Task A Prompt-Based Approach using Lightweight Language Models for Readability-Controlled Text Simplification. In Proceedings of the Fourth Workshop on Text Simplification, Accessibility and Readability (TSAR 2025), pages 183–192, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
HULAT-UC3M at TSAR 2025 Shared Task A Prompt-Based Approach using Lightweight Language Models for Readability-Controlled Text Simplification (Sanchez-Gomez et al., TSAR 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.tsar-1.15.pdf