TUNI: A Textual Unimodal Detector for Identity Inference in CLIP Models

Songze Li, Ruoxi Cheng, Xiaojun Jia

Abstract
The widespread use of large-scale multimodal models like CLIP has heightened concerns about the leakage of personally identifiable information (PII). Existing methods for identity inference in CLIP models require querying the model with full PII, including textual descriptions of the person and corresponding images (e.g., the person's name and face photo). However, querying with images risks exposing additional personal information to the target model, since an image may never have been encountered by that model before. Additionally, previous membership inference attacks (MIAs) train shadow models to mimic the behavior of the target model, which incurs high computational costs, especially for large CLIP models. To address these challenges, we propose TUNI, a textual unimodal detector for identity inference in CLIP models, a novel technique that: 1) queries the target model with text data only; and 2) eliminates the need for training shadow models. Extensive experiments with TUNI across various CLIP model architectures and datasets demonstrate its superior performance over baselines, despite relying only on text data.
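
The abstract does not specify TUNI's detection procedure, so the following is only a minimal, hypothetical sketch of what text-only probing of a CLIP model could look like: it embeds several name-bearing prompt templates with the CLIP text encoder and scores their mutual consistency. The prompt templates, the consistency score, and the checkpoint name are illustrative assumptions, not the paper's actual detector.

    # Hypothetical sketch of text-only identity probing against a CLIP model.
    # The scoring rule below (mean pairwise cosine similarity across prompt
    # templates) is an illustrative assumption, NOT the method from the paper.
    import torch
    from transformers import CLIPModel, CLIPTokenizer

    model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
    tokenizer = CLIPTokenizer.from_pretrained("openai/clip-vit-base-patch32")
    model.eval()

    def text_only_identity_score(name: str) -> float:
        """Embed several text templates mentioning `name` and return their
        mean pairwise cosine similarity (higher = more consistent)."""
        templates = [
            f"a photo of {name}",
            f"a portrait of {name}",
            f"an image of the face of {name}",
        ]
        inputs = tokenizer(templates, padding=True, return_tensors="pt")
        with torch.no_grad():
            feats = model.get_text_features(**inputs)
        feats = feats / feats.norm(dim=-1, keepdim=True)  # unit-normalize
        sims = feats @ feats.T                            # pairwise cosines
        n = len(templates)
        off_diag = sims.sum() - sims.diagonal().sum()     # drop self-similarity
        return (off_diag / (n * (n - 1))).item()

    print(text_only_identity_score("Jane Doe"))

A threshold on such a score, calibrated on names known to be outside the training data, could then separate member from non-member identities; the actual criterion used by TUNI is given in the paper.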
Anthology ID:
2025.privatenlp-main.1
Volume:
Proceedings of the Sixth Workshop on Privacy in Natural Language Processing
Month:
April
Year:
2025
Address:
Albuquerque, New Mexico
Editors:
Ivan Habernal, Sepideh Ghanavati, Vijayanta Jain, Timour Igamberdiev, Shomir Wilson
Venues:
PrivateNLP | WS
Publisher:
Association for Computational Linguistics
Pages:
1–13
URL:
https://preview.aclanthology.org/fix-sig-urls/2025.privatenlp-main.1/
Cite (ACL):
Songze Li, Ruoxi Cheng, and Xiaojun Jia. 2025. TUNI: A Textual Unimodal Detector for Identity Inference in CLIP Models. In Proceedings of the Sixth Workshop on Privacy in Natural Language Processing, pages 1–13, Albuquerque, New Mexico. Association for Computational Linguistics.
Cite (Informal):
TUNI: A Textual Unimodal Detector for Identity Inference in CLIP Models (Li et al., PrivateNLP 2025)
PDF:
https://preview.aclanthology.org/fix-sig-urls/2025.privatenlp-main.1.pdf