Contrastive Perplexity for Controlled Generation: An Application in Detoxifying Large Language Models

Tassilo Klein, Moin Nabi


Abstract
The generation of toxic content by large language models (LLMs) remains a critical challenge for the safe deployment of language technology. We propose a novel framework for implicit knowledge editing and controlled text generation by fine-tuning LLMs with a prototype-based contrastive perplexity objective. Central to our method is the construction of hard negatives—toxic outputs that are generated through adversarial paraphrasing to be semantically similar and close in model probability to their non-toxic counterparts. By training on these challenging and realistic pairs, our approach ensures robust and stable contrastive optimization. Experimental results in the domain of detoxification demonstrate that our method significantly reduces toxic generation while maintaining strong performance on downstream tasks such as commonsense reasoning and reading comprehension. Our findings highlight the effectiveness of exploiting hard negatives for attribute-aware fine-tuning.
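
The abstract describes the objective only at a high level. As a rough illustration (a minimal sketch, not the authors' released implementation), the code below shows one plausible form of a pairwise contrastive perplexity loss for a causal LM: it contrasts the mean token log-likelihood of a non-toxic sequence against its paired toxic hard negative in an InfoNCE-style softmax. The function names, the paired `pos`/`neg` batches, and the `temperature` parameter are illustrative assumptions; the paper's prototype-based formulation and negative construction may differ.

```python
# Illustrative sketch of a contrastive perplexity objective.
# Assumes a Hugging Face-style causal LM and that each non-toxic
# "positive" batch is already paired with an adversarially
# paraphrased toxic "negative" batch, as described in the abstract.
import torch
import torch.nn.functional as F


def sequence_log_likelihood(model, input_ids, attention_mask):
    """Mean per-token log-likelihood of each sequence under the model."""
    labels = input_ids.clone()
    labels[attention_mask == 0] = -100  # ignore padding positions
    out = model(input_ids=input_ids, attention_mask=attention_mask)
    logits = out.logits[:, :-1, :]      # predict token t+1 from prefix
    targets = labels[:, 1:]
    logp = F.log_softmax(logits, dim=-1)
    # gather log-prob of each target token; clamp keeps -100 indices valid,
    # the mask below zeroes them out
    tok_ll = logp.gather(-1, targets.clamp(min=0).unsqueeze(-1)).squeeze(-1)
    mask = (targets != -100).float()
    return (tok_ll * mask).sum(-1) / mask.sum(-1)  # shape (B,)


def contrastive_perplexity_loss(model, pos, neg, temperature=1.0):
    """Pairwise InfoNCE-style contrast: raise the likelihood of the
    non-toxic sequence relative to its paired toxic hard negative."""
    ll_pos = sequence_log_likelihood(model, **pos)
    ll_neg = sequence_log_likelihood(model, **neg)
    logits = torch.stack([ll_pos, ll_neg], dim=-1) / temperature
    # index 0 = positive; minimizing cross-entropy widens the gap
    target = torch.zeros(logits.size(0), dtype=torch.long, device=logits.device)
    return F.cross_entropy(logits, target)
```

Because both terms are length-normalized log-likelihoods, the contrast is equivalent to comparing (log) perplexities of the paired sequences, which matches the "contrastive perplexity" framing; the hard negatives' similarity in model probability is what keeps the softmax contrast informative rather than trivially saturated.
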
Anthology ID:
2025.acl-long.125
Volume:
Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2025
Address:
Vienna, Austria
Editors:
Wanxiang Che, Joyce Nabende, Ekaterina Shutova, Mohammad Taher Pilehvar
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
2493–2508
URL:
https://preview.aclanthology.org/ingestion-acl-25/2025.acl-long.125/
Cite (ACL):
Tassilo Klein and Moin Nabi. 2025. Contrastive Perplexity for Controlled Generation: An Application in Detoxifying Large Language Models. In Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 2493–2508, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
Contrastive Perplexity for Controlled Generation: An Application in Detoxifying Large Language Models (Klein & Nabi, ACL 2025)
PDF:
https://preview.aclanthology.org/ingestion-acl-25/2025.acl-long.125.pdf