Locally Differentially Private Document Generation Using Zero Shot Prompting

Saiteja Utpala, Sara Hooker, Pin-Yu Chen


Abstract
Numerous studies have highlighted the privacy risks associated with large language models. Our research offers a unique perspective by demonstrating that pretrained large language models can effectively contribute to privacy preservation. We propose a locally differentially private mechanism called DP-Prompt, which leverages the power of pretrained large language models and zero-shot prompting to counter author de-anonymization attacks while minimizing the impact on downstream utility. When DP-Prompt is used with a powerful language model like ChatGPT (gpt-3.5), we observe a notable reduction in the success rate of de-anonymization attacks, showing that it surpasses existing approaches by a considerable margin despite its simpler design. For instance, in the case of the IMDB dataset, DP-Prompt (with ChatGPT) perfectly recovers the clean sentiment F1 score while achieving a 46% reduction in author identification F1 score against static attackers and a 26% reduction against adaptive attackers. We conduct extensive experiments across six open-source large language models, ranging up to 7 billion parameters, to analyze various effects of the privacy-utility tradeoff.
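The abstract describes DP-Prompt as zero-shot paraphrasing with a pretrained LLM, where sampling randomness provides the local differential privacy guarantee. Below is a minimal, hypothetical sketch of the core sampling step under the usual exponential-mechanism analysis: logits are clipped to a bounded range and a token is drawn with temperature `T`, which yields a per-token privacy bound of the form `2 * (b_max - b_min) / T`. The function names, clipping bounds, and the exact accounting are illustrative assumptions, not taken from the paper text shown here.

```python
import numpy as np

def clip_logits(logits, b_min=-1.0, b_max=1.0):
    """Bound each logit so the mechanism's sensitivity is finite (assumed bounds)."""
    return np.clip(logits, b_min, b_max)

def sample_token(logits, temperature, rng=None):
    """Draw one token id from a temperature-scaled softmax over clipped logits."""
    rng = rng or np.random.default_rng()
    z = clip_logits(np.asarray(logits, dtype=float)) / temperature
    p = np.exp(z - z.max())          # stable softmax
    p /= p.sum()
    return int(rng.choice(len(p), p=p))

def per_token_epsilon(b_min, b_max, temperature):
    """Exponential-mechanism style bound: higher temperature -> smaller epsilon."""
    return 2.0 * (b_max - b_min) / temperature
```

In this reading, the temperature is the single knob trading privacy for paraphrase fidelity: raising `T` flattens the token distribution (stronger privacy, noisier paraphrases), while lowering it concentrates mass on the model's preferred tokens.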
Anthology ID:
2023.findings-emnlp.566
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
8442–8457
URL:
https://preview.aclanthology.org/build-pipeline-with-new-library/2023.findings-emnlp.566/
DOI:
10.18653/v1/2023.findings-emnlp.566
Cite (ACL):
Saiteja Utpala, Sara Hooker, and Pin-Yu Chen. 2023. Locally Differentially Private Document Generation Using Zero Shot Prompting. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 8442–8457, Singapore. Association for Computational Linguistics.
Cite (Informal):
Locally Differentially Private Document Generation Using Zero Shot Prompting (Utpala et al., Findings 2023)
PDF:
https://preview.aclanthology.org/build-pipeline-with-new-library/2023.findings-emnlp.566.pdf