From Complex Word Identification to Substitution: Instruction-Tuned Language Models for Lexical Simplification

Tonghui Han, Xinru Zhang, Yaxin Bi, Maurice D. Mulvenna, Dongqiang Yang

Abstract
Lexical-level sentence simplification is essential for improving text accessibility, yet traditional methods often struggle to dynamically identify complex terms and generate contextually appropriate substitutions, resulting in limited generalization. While prompt-based approaches with large language models (LLMs) have shown strong performance and adaptability, they often lack interpretability and are prone to hallucination. This study proposes a fine-tuning approach for mid-sized LLMs to emulate the lexical simplification pipeline. We transform complex word identification datasets into an instruction–response format to support instruction tuning. Experimental results show that our method substantially improves complex word identification accuracy while reducing hallucinations, and achieves competitive performance on lexical simplification benchmarks. Furthermore, we find that integrating fine-tuning with prompt engineering reduces the dependence on manual prompt optimization, leading to a more efficient simplification framework.
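
The core data step named in the abstract — recasting complex word identification (CWI) annotations as instruction–response pairs — can be illustrated with a minimal sketch. The record fields ("sentence", "complex_words") and the prompt wording below are illustrative assumptions, not the authors' actual templates.

    # Minimal sketch, assuming a CWI record holds a sentence and its
    # annotated complex words; field names and prompt text are hypothetical.
    def cwi_to_instruction(record: dict) -> dict:
        """Convert one CWI annotation into an instruction-tuning example."""
        instruction = (
            "Identify the complex words in the following sentence and "
            "list them separated by commas.\n"
            f"Sentence: {record['sentence']}"
        )
        response = ", ".join(record["complex_words"])
        return {"instruction": instruction, "response": response}

    example = {
        "sentence": "The committee reached a unanimous verdict.",
        "complex_words": ["unanimous", "verdict"],
    }
    print(cwi_to_instruction(example))

The substitution stage of the pipeline could follow the same pattern, with the identified words inserted into a follow-up prompt; the paper's exact prompt design is not reproduced here.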
Anthology ID:
2025.starsem-1.4
Volume:
Proceedings of the 14th Joint Conference on Lexical and Computational Semantics (*SEM 2025)
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Lea Frermann, Mark Stevenson
Venue:
*SEM
Publisher:
Association for Computational Linguistics
Pages:
48–58
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.starsem-1.4/
Cite (ACL):
Tonghui Han, Xinru Zhang, Yaxin Bi, Maurice D. Mulvenna, and Dongqiang Yang. 2025. From Complex Word Identification to Substitution: Instruction-Tuned Language Models for Lexical Simplification. In Proceedings of the 14th Joint Conference on Lexical and Computational Semantics (*SEM 2025), pages 48–58, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
From Complex Word Identification to Substitution: Instruction-Tuned Language Models for Lexical Simplification (Han et al., *SEM 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.starsem-1.4.pdf