Text Detoxification: Data Efficiency, Semantic Preservation and Model Generalization

Jing Yu, Yibo Zhao, Jiapeng Zhu, Wenming Shao, Bo Pang, Zhao Zhang, Xiang Li


Abstract
The widespread dissemination of toxic content on social media poses a serious threat to both online environments and public discourse, highlighting the urgent need for detoxification methods that effectively remove toxicity while preserving the original semantics. However, existing approaches often struggle to simultaneously achieve strong detoxification performance, semantic preservation, and robustness to out-of-distribution data. Moreover, they typically rely on costly, manually annotated parallel corpora while showing poor data efficiency. To address these challenges, we propose GEM, a two-stage training framework that jointly optimizes Model Generalization, Data Efficiency, and Semantic Preservation. We first perform supervised fine-tuning on a small set of high-quality, filtered parallel data to establish a strong initialization. Then, we leverage unlabeled toxic inputs and a custom-designed reward model to train the LLM using Group Relative Policy Optimization. Experimental results demonstrate that our method effectively mitigates the trade-offs faced by previous work, achieving state-of-the-art performance with improved generalization and significantly reduced dependence on annotated data. Our code is available at https://github.com/allacnobug/Detoxification-of-Text.
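The abstract's second stage trains on unlabeled toxic inputs with a custom reward model and Group Relative Policy Optimization (GRPO). The sketch below is not from the paper; the combined reward, its weights, and the scorer names are illustrative assumptions. It only shows the general shape of such a stage: score a group of candidate rewrites for detoxification and semantic preservation, then normalize rewards within the group, which is how GRPO-style methods form advantages.

# Illustrative sketch only: hypothetical reward combination and GRPO-style
# group-relative advantages; not the authors' exact reward design.
from typing import Callable, List
import statistics

def combined_reward(
    source: str,
    candidate: str,
    toxicity_score: Callable[[str], float],         # assumed scorer: 0 (clean) .. 1 (toxic)
    similarity_score: Callable[[str, str], float],  # assumed scorer: 0 .. 1 semantic similarity
    w_detox: float = 0.5,
    w_sim: float = 0.5,
) -> float:
    """Reward a candidate for being non-toxic while staying close to the source text."""
    return w_detox * (1.0 - toxicity_score(candidate)) + w_sim * similarity_score(source, candidate)

def group_relative_advantages(rewards: List[float], eps: float = 1e-6) -> List[float]:
    """GRPO-style advantages: normalize each reward against its own sampled group."""
    mean = statistics.fmean(rewards)
    std = statistics.pstdev(rewards)
    return [(r - mean) / (std + eps) for r in rewards]

# Example: rewards for four sampled rewrites of one unlabeled toxic input.
print(group_relative_advantages([0.82, 0.61, 0.74, 0.55]))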
Anthology ID:
2025.emnlp-main.1636
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
32160–32174
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1636/
Cite (ACL):
Jing Yu, Yibo Zhao, Jiapeng Zhu, Wenming Shao, Bo Pang, Zhao Zhang, and Xiang Li. 2025. Text Detoxification: Data Efficiency, Semantic Preservation and Model Generalization. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 32160–32174, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Text Detoxification: Data Efficiency, Semantic Preservation and Model Generalization (Yu et al., EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1636.pdf
Checklist:
 2025.emnlp-main.1636.checklist.pdf