Neuron Activation Modulation for Text Style Transfer: Guiding Large Language Models

Chaona Kong, Jianyi Liu, Yifan Tang, Ru Zhang


Abstract
Text style transfer (TST) aims to flexibly adjust the style of a text while preserving its core content. Although large language models (LLMs) excel at TST, they often suffer from a unidirectional transfer problem, performing well in only one transfer direction, owing to imbalanced training data and their tendency to generate safer responses. These challenges pose a significant obstacle to effective style transfer. To address this issue, we propose a novel method for text style transfer based on neuron activation modulation (NAM-TST). The approach identifies style-related neurons through gradient-based activation difference analysis and calculates the activation differences between the source and target styles. During text generation, these activation differences are used to align the activation values of style-related neurons with those of the target style, guiding the model to perform the transfer. This strategy enables the model to generate text that satisfies specific style requirements, effectively mitigating the unidirectional issue inherent in LLMs during style transfer. Experiments on benchmark datasets demonstrate that NAM-TST significantly enhances style transfer quality while preserving content consistency.
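The abstract describes steering generation by shifting the activations of pre-identified style-related neurons toward the target style. The sketch below is a minimal, hypothetical illustration of that general idea using PyTorch forward hooks, with GPT-2 as a stand-in model; the chosen layer, the `style_neurons` map, the per-neuron deltas, and the strength `alpha` are illustrative assumptions, not the authors' implementation or reported settings.

```python
# Minimal sketch (assumptions only): add a (target - source) activation difference
# to selected MLP neurons during generation to nudge the output toward a style.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # stand-in model; the paper targets larger LLMs
tok = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name).eval()

# Hypothetical output of a neuron-identification step:
# layer index -> (neuron indices, per-neuron activation difference target - source)
style_neurons = {
    5: (torch.tensor([12, 301, 977]), torch.tensor([0.8, -0.5, 1.2])),
}
alpha = 1.0  # modulation strength (assumed)

def make_hook(idx, delta):
    def hook(module, inputs, output):
        # output: intermediate MLP activations; shift only the selected neurons
        output[..., idx] = output[..., idx] + alpha * delta
        return output
    return hook

handles = []
for layer, (idx, delta) in style_neurons.items():
    # hook after the MLP's first projection (c_fc) in GPT-2; a simplification
    h = model.transformer.h[layer].mlp.c_fc.register_forward_hook(make_hook(idx, delta))
    handles.append(h)

prompt = "Rewrite in a formal style: that movie was super lame."
inputs = tok(prompt, return_tensors="pt")
with torch.no_grad():
    out = model.generate(**inputs, max_new_tokens=40, do_sample=False)
print(tok.decode(out[0], skip_special_tokens=True))

for h in handles:
    h.remove()  # restore unmodified behavior
```

In this toy setup the deltas are hard-coded; in the paper's framing they would come from the gradient-based activation difference analysis between source-style and target-style text.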
Anthology ID:
2025.findings-acl.403
Volume:
Findings of the Association for Computational Linguistics: ACL 2025
Month:
July
Year:
2025
Address:
Vienna, Austria
Editors:
Wanxiang Che, Joyce Nabende, Ekaterina Shutova, Mohammad Taher Pilehvar
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
7735–7747
URL:
https://preview.aclanthology.org/landing_page/2025.findings-acl.403/
Cite (ACL):
Chaona Kong, Jianyi Liu, Yifan Tang, and Ru Zhang. 2025. Neuron Activation Modulation for Text Style Transfer: Guiding Large Language Models. In Findings of the Association for Computational Linguistics: ACL 2025, pages 7735–7747, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
Neuron Activation Modulation for Text Style Transfer: Guiding Large Language Models (Kong et al., Findings 2025)
PDF:
https://preview.aclanthology.org/landing_page/2025.findings-acl.403.pdf