LLMs are Privacy Erasable

Zipeng Ye, Wenjian Luo


Abstract
The capabilities of large language models (LLMs) are advancing at a remarkable pace, along with a surge in cloud services powered by LLMs. Their convenience has gradually transformed people's work routines. However, for services such as document summarization and editing, users must upload relevant files or context to obtain the desired results, which may inadvertently expose their privacy. This paper aims to address the challenging trade-off between the convenience of LLM services and user privacy concerns. Specifically, based on the structural and functional characteristics of LLMs, we develop a strategy that safeguards user prompts while accessing LLM cloud services, even in scenarios where advanced reconstruction attacks are adopted. We comprehensively evaluate the efficacy of our method across prominent LLM benchmarks. The empirical results show that our method not only effectively thwarts reconstruction attacks but also, on certain tasks, improves model performance, surpassing the outcomes reported in official model cards.
Anthology ID:
2025.findings-emnlp.304
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2025
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
5672–5692
URL:
https://preview.aclanthology.org/author-page-yu-wang-polytechnic/2025.findings-emnlp.304/
DOI:
10.18653/v1/2025.findings-emnlp.304
Cite (ACL):
Zipeng Ye and Wenjian Luo. 2025. LLMs are Privacy Erasable. In Findings of the Association for Computational Linguistics: EMNLP 2025, pages 5672–5692, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
LLMs are Privacy Erasable (Ye & Luo, Findings 2025)
PDF:
https://preview.aclanthology.org/author-page-yu-wang-polytechnic/2025.findings-emnlp.304.pdf
Checklist:
2025.findings-emnlp.304.checklist.pdf