Abstract
Natural language processing (NLP) models have become increasingly popular in real-world applications such as text classification. However, they are vulnerable to privacy attacks, including data reconstruction attacks that aim to extract the data used to train the model. Most previous studies on data reconstruction attacks have focused on large language models (LLMs), while classification models were assumed to be more secure. In this work, we propose a new targeted data reconstruction attack, the Mix And Match attack, which exploits the fact that most classification models are built on top of LLMs. The Mix And Match attack uses the base model of the target classifier to generate candidate tokens and then prunes them using the classification head. We extensively demonstrate the effectiveness of the attack using both random and organic canaries. This work highlights the privacy risks that data reconstruction attacks pose to classification models and offers insights into possible leakages.
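The generate-then-prune loop the abstract describes can be sketched roughly as follows. This is a minimal illustrative sketch, not the authors' implementation: the model names, the trailing-mask candidate generation, the top-k width, and the confidence-based pruning rule are all assumptions layered on the one-sentence description above.

```python
# Hypothetical sketch of the Mix And Match idea: the public base LM proposes
# candidate tokens, and the fine-tuned classification head prunes them.
import torch
from transformers import (
    AutoTokenizer,
    AutoModelForMaskedLM,
    AutoModelForSequenceClassification,
)

BASE_MODEL = "roberta-base"                    # assumed public base of the target (illustrative)
TARGET_MODEL = "path/to/finetuned-classifier"  # placeholder for the attacker's access to the target

tok = AutoTokenizer.from_pretrained(BASE_MODEL)
base_lm = AutoModelForMaskedLM.from_pretrained(BASE_MODEL).eval()
clf = AutoModelForSequenceClassification.from_pretrained(TARGET_MODEL).eval()

@torch.no_grad()
def reconstruct(prefix: str, target_label: int, steps: int = 5, k: int = 20) -> str:
    """Greedily extend `prefix` one token at a time: the base LM proposes
    top-k fillers for a trailing mask, and the classification head keeps
    the candidate scored most confidently as `target_label`."""
    text = prefix
    for _ in range(steps):
        # 1) Candidate generation with the base masked LM.
        enc = tok(text + " " + tok.mask_token, return_tensors="pt")
        mask_pos = (enc.input_ids == tok.mask_token_id).nonzero()[0, 1]
        logits = base_lm(**enc).logits[0, mask_pos]
        candidates = logits.topk(k).indices.tolist()
        # 2) Pruning with the target classifier's head.
        best_text, best_score = text, float("-inf")
        for cand_id in candidates:
            cand_text = text + tok.decode([cand_id])
            probs = clf(**tok(cand_text, return_tensors="pt")).logits.softmax(-1)
            score = probs[0, target_label].item()
            if score > best_score:
                best_text, best_score = cand_text, score
        text = best_text
    return text
```

Using the target's label confidence as the pruning signal is one plausible reading of "prunes them using the classification head"; the paper evaluates the actual attack on both random and organic canaries.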
- Anthology ID:
- 2024.privatenlp-1.15
- Volume:
- Proceedings of the Fifth Workshop on Privacy in Natural Language Processing
- Month:
- August
- Year:
- 2024
- Address:
- Bangkok, Thailand
- Editors:
- Ivan Habernal, Sepideh Ghanavati, Abhilasha Ravichander, Vijayanta Jain, Patricia Thaine, Timour Igamberdiev, Niloofar Mireshghallah, Oluwaseyi Feyisetan
- Venues:
- PrivateNLP | WS
- Publisher:
- Association for Computational Linguistics
- Pages:
- 143–158
- URL:
- https://preview.aclanthology.org/add_missing_videos/2024.privatenlp-1.15/
- Cite (ACL):
- Adel Elmahdy and Ahmed Salem. 2024. Deconstructing Classifiers: Towards A Data Reconstruction Attack Against Text Classification Models. In Proceedings of the Fifth Workshop on Privacy in Natural Language Processing, pages 143–158, Bangkok, Thailand. Association for Computational Linguistics.
- Cite (Informal):
- Deconstructing Classifiers: Towards A Data Reconstruction Attack Against Text Classification Models (Elmahdy & Salem, PrivateNLP 2024)
- PDF:
- https://preview.aclanthology.org/add_missing_videos/2024.privatenlp-1.15.pdf