Abstract
Causal language models such as the GPT series have achieved significant success across various domains. However, their application to the lexical substitution task (LST) remains largely unexplored due to inherent limitations in autoregressive decoding. Our work is motivated by the observation that existing LST approaches tend to suffer from a misalignment between the pre-training objectives of the language models they employ and their subsequent fine-tuning and application for substitute generation. We introduce PromptSub, the first system to use causal language modeling (CLM) for LST. Through prompt-aware fine-tuning, PromptSub not only enriches the given context with additional knowledge, but also leverages the unidirectional nature of autoregressive decoding. PromptSub consistently outperforms GeneSis, the best previously published supervised LST method. Further analysis demonstrates the potential of PromptSub to benefit further from increased model capacity, expanded data resources, and retrieval of external knowledge. By framing LST within the paradigm of CLM, our approach indicates the versatility of general CLM-based systems, such as ChatGPT, in catering to specialized tasks, including LST.

- Anthology ID: 2024.starsem-1.10
- Volume: Proceedings of the 13th Joint Conference on Lexical and Computational Semantics (*SEM 2024)
- Month: June
- Year: 2024
- Address: Mexico City, Mexico
- Editors: Danushka Bollegala, Vered Shwartz
- Venue: *SEM
- SIG: SIGLEX
- Publisher: Association for Computational Linguistics
- Pages: 120–132
- URL: https://aclanthology.org/2024.starsem-1.10
- Cite (ACL): Ning Shi, Bradley Hauer, and Grzegorz Kondrak. 2024. Lexical Substitution as Causal Language Modeling. In Proceedings of the 13th Joint Conference on Lexical and Computational Semantics (*SEM 2024), pages 120–132, Mexico City, Mexico. Association for Computational Linguistics.
- Cite (Informal): Lexical Substitution as Causal Language Modeling (Shi et al., *SEM 2024)
- PDF: https://preview.aclanthology.org/jeptaln-2024-ingestion/2024.starsem-1.10.pdf