Sentence-level Privacy for Document Embeddings

Casey Meehan, Khalil Mrini, Kamalika Chaudhuri


Abstract
User language data can contain highly sensitive personal content. As such, it is imperative to offer users a strong and interpretable privacy guarantee when learning from their data. In this work, we propose SentDP: pure local differential privacy at the sentence level for a single user document. We propose a novel technique, DeepCandidate, that combines concepts from robust statistics and language modeling to produce high-dimensional (768), general 𝜖-SentDP document embeddings. This guarantees that any single sentence in a document can be substituted with any other sentence while keeping the embedding 𝜖-indistinguishable. Our experiments indicate that these private document embeddings are useful for downstream tasks like sentiment analysis and topic classification, and even outperform baseline methods with weaker guarantees like word-level Metric DP.
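Concretely, the guarantee stated in the abstract can be read as a standard differential-privacy-style indistinguishability bound. The following is a sketch inferred from the abstract's wording (the paper's formal definition may differ in notation): a randomized embedding mechanism $M$ satisfies $\epsilon$-SentDP if, for any two documents $D$ and $D'$ that differ in exactly one sentence, and any set $S$ of possible output embeddings,

\[
  \Pr[M(D) \in S] \;\le\; e^{\epsilon} \, \Pr[M(D') \in S].
\]

Smaller $\epsilon$ makes the two output distributions harder to tell apart, so the released embedding reveals little about the content of any individual sentence.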
Anthology ID:
2022.acl-long.238
Volume:
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
May
Year:
2022
Address:
Dublin, Ireland
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
3367–3380
URL:
https://aclanthology.org/2022.acl-long.238
DOI:
10.18653/v1/2022.acl-long.238
Cite (ACL):
Casey Meehan, Khalil Mrini, and Kamalika Chaudhuri. 2022. Sentence-level Privacy for Document Embeddings. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 3367–3380, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Sentence-level Privacy for Document Embeddings (Meehan et al., ACL 2022)
PDF:
https://preview.aclanthology.org/nodalida-main-page/2022.acl-long.238.pdf
Video:
https://preview.aclanthology.org/nodalida-main-page/2022.acl-long.238.mp4
Data:
IMDb Movie Reviews