GTA: Gated Toxicity Avoidance for LM Performance Preservation

Heegyu Kim, Hyunsouk Cho


Abstract
Caution: This paper includes offensive words that could potentially cause unpleasantness. The fast-paced evolution of generative language models such as GPT-4 has demonstrated outstanding results in various NLP generation tasks. However, because these models can generate offensive words related to race or gender, various Controllable Text Generation (CTG) methods have been proposed to mitigate the occurrence of harmful words. Unfortunately, existing CTG methods not only reduce toxicity but also degrade several aspects of the language model's generation performance, including topic consistency, grammar, and perplexity. This paper explores the limitations of previous methods and introduces a novel solution in the form of a simple Gated Toxicity Avoidance (GTA) that can be applied to any CTG method. We also evaluate the effectiveness of the proposed GTA by comparing it with state-of-the-art CTG methods across various datasets. Our findings reveal that gated toxicity avoidance achieves toxicity reduction comparable to the original CTG methods while preserving the generation performance of the language model.
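To make the gating idea concrete, the sketch below shows one plausible reading of gated toxicity avoidance: apply the CTG logit adjustment only at decoding steps where a toxicity gate flags the current prefix as risky, and otherwise decode from the unmodified language model so that fluency and topic consistency are preserved. This is not the authors' implementation; the anti-expert-style steering, the gate model (`unitary/toxic-bert`), the threshold, and all hyperparameters are illustrative assumptions.

```python
# Hedged sketch of gated guided decoding (not the paper's official code).
# Assumptions: the CTG method is an anti-expert-style logit adjustment,
# and the gate is an off-the-shelf toxicity classifier over the prefix.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline

device = "cuda" if torch.cuda.is_available() else "cpu"
tok = AutoTokenizer.from_pretrained("gpt2")
lm = AutoModelForCausalLM.from_pretrained("gpt2").to(device)
# Placeholder anti-expert; in practice this would be a model tuned on toxic text.
anti_expert = AutoModelForCausalLM.from_pretrained("gpt2").to(device)
# Illustrative gate: any prefix-level toxicity classifier would do.
gate = pipeline("text-classification", model="unitary/toxic-bert",
                device=0 if device == "cuda" else -1)

def generate_gated(prompt, max_new_tokens=30, alpha=2.0, gate_threshold=0.5):
    ids = tok(prompt, return_tensors="pt").input_ids.to(device)
    for _ in range(max_new_tokens):
        with torch.no_grad():
            lm_logits = lm(ids).logits[:, -1, :]
            # Gate: only pay the CTG distortion when the prefix looks risky.
            prefix = tok.decode(ids[0], skip_special_tokens=True)
            pred = gate(prefix, truncation=True)[0]
            risky = pred["label"] == "toxic" and pred["score"] > gate_threshold
            if risky:
                anti_logits = anti_expert(ids).logits[:, -1, :]
                # Steer away from continuations the anti-expert prefers.
                logits = lm_logits - alpha * anti_logits
            else:
                # Untouched LM distribution preserves fluency and topic.
                logits = lm_logits
        next_id = torch.argmax(logits, dim=-1, keepdim=True)
        ids = torch.cat([ids, next_id], dim=-1)
    return tok.decode(ids[0], skip_special_tokens=True)

print(generate_gated("The new movie was"))
```

Because the adjustment is skipped on benign prefixes, the sketch only diverges from plain greedy decoding when the gate fires, which is the intuition behind preserving generation quality while still suppressing toxic continuations.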
Anthology ID:
2023.findings-emnlp.983
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
14747–14763
URL:
https://aclanthology.org/2023.findings-emnlp.983
DOI:
10.18653/v1/2023.findings-emnlp.983
Cite (ACL):
Heegyu Kim and Hyunsouk Cho. 2023. GTA: Gated Toxicity Avoidance for LM Performance Preservation. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 14747–14763, Singapore. Association for Computational Linguistics.
Cite (Informal):
GTA: Gated Toxicity Avoidance for LM Performance Preservation (Kim & Cho, Findings 2023)
PDF:
https://preview.aclanthology.org/nschneid-patch-2/2023.findings-emnlp.983.pdf