Rares-Alexandru Roscan


2025

Archaeology at TSAR 2025 Shared Task: Teaching Small Models to do CEFR Simplifications
Rares-Alexandru Roscan | Sergiu Nisioi
Proceedings of the Fourth Workshop on Text Simplification, Accessibility and Readability (TSAR 2025)

Large language models (LLMs) have demonstrated strong performance on text simplification tasks, but their high computational cost and proprietary nature often limit practical use, especially in education. We explore open-source LLMs for simplifying texts to target CEFR levels. By reducing model size and computational requirements, our approach enables broader accessibility and deployment in educational environments. Our results show some of the lowest error rates in producing CEFR-compliant texts at the TSAR 2025 shared task, using models with 8 billion and 1 billion parameters. Such approaches have the potential to democratize NLP technologies for real-world applications.
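As an illustration of the general setup the abstract describes, the sketch below composes an instruction prompt asking an open-source model to rewrite a text at a target CEFR level. This is a minimal, hypothetical example: the prompt wording, the `build_simplification_prompt` helper, and the level list are assumptions for illustration, not the authors' actual method or prompts.

```python
# Hypothetical sketch of CEFR-targeted simplification prompting.
# The prompt template below is illustrative, not the authors' setup.

CEFR_LEVELS = ["A1", "A2", "B1", "B2", "C1", "C2"]


def build_simplification_prompt(text: str, target_level: str) -> str:
    """Compose an instruction asking a model to rewrite `text`
    at the requested CEFR level."""
    if target_level not in CEFR_LEVELS:
        raise ValueError(f"unknown CEFR level: {target_level}")
    return (
        f"Rewrite the following text so that it is readable at CEFR level "
        f"{target_level}. Preserve the meaning and use vocabulary and "
        f"sentence structures appropriate for that level.\n\n"
        f"Text: {text}\n\nRewrite:"
    )


if __name__ == "__main__":
    prompt = build_simplification_prompt(
        "Photosynthesis converts light energy into chemical energy.", "A2"
    )
    # The prompt string would then be passed to an instruction-tuned
    # open-source model (e.g. an 8B- or 1B-parameter model) for generation.
    print(prompt)
```

In a real pipeline, the returned string would be fed to a locally hosted instruction-tuned model, keeping the whole workflow open-source and deployable on modest hardware.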