Reveal and Release: Iterative LLM Unlearning with Self-generated Data

Linxi Xie, Xin Teng, Shichang Ke, Hongyi Wen, Shenji Wan


Abstract
Large language model (LLM) unlearning has demonstrated effectiveness in removing the influence of undesirable data (also known as forget data). Existing approaches typically assume full access to the forget dataset, overlooking two key challenges: (1) forget data is often privacy-sensitive, rare, or legally regulated, making it expensive or impractical to obtain; and (2) the distribution of available forget data may not align with how that information is represented within the model. To address these limitations, we propose a “Reveal-and-Release” method that unlearns with self-generated data: we prompt the model to reveal what it knows using optimized instructions. To fully utilize the self-generated forget data, we propose an iterative unlearning framework that makes incremental adjustments to the model’s weight space with parameter-efficient modules trained on the forget data. Experimental results demonstrate that our method balances the tradeoff between forget quality and utility preservation.
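The abstract's loop — reveal what the model currently knows, then release it via a parameter-efficient update, and repeat — can be sketched numerically. The code below is not the paper's algorithm; it is a toy illustration under stated assumptions: the "model" is a single linear map, the "optimized instructions" are fixed probe inputs, the self-generated forget data are the model's own outputs on those probes, the parameter-efficient module is a LoRA-style low-rank delta, and "unlearning" is implemented as regressing the forget outputs toward a null (scrubbed) target while pinning retain outputs. All names (`reveal_and_release`, `x_forget`, `x_retain`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
d, r = 8, 4                          # toy model width, adapter rank

W = rng.normal(size=(d, d))          # stand-in "model": one linear map
x_forget = rng.normal(size=(5, d))   # probes eliciting forget-topic behaviour
x_retain = rng.normal(size=(5, d))   # probes covering utility to preserve
y_retain = x_retain @ W.T            # utility outputs that must survive

def reveal_and_release(W, rounds=3, steps=500, lr=0.003):
    """Toy iterative loop: each round 'reveals' the model's current outputs
    on the forget probes (self-generated forget data), then 'releases' them
    by fitting a low-rank delta A @ B that drives those outputs toward a
    null target while keeping retain outputs close to y_retain."""
    for _ in range(rounds):
        y_self = x_forget @ W.T          # reveal: self-generated forget data
        target = np.zeros_like(y_self)   # release: replace with a null response
        A = rng.normal(size=(d, r)) * 0.1
        B = rng.normal(size=(r, d)) * 0.1
        for _ in range(steps):
            Wp = W + A @ B               # model with adapter applied
            e_f = x_forget @ Wp.T - target      # forget residual (shrink it)
            e_r = x_retain @ Wp.T - y_retain    # retain drift (keep it small)
            # gradient of 0.5*(||e_f||^2 + ||e_r||^2) w.r.t. the full delta
            g = (e_f.T @ x_forget + e_r.T @ x_retain) / len(x_forget)
            # chain rule through the low-rank factorization delta = A @ B
            A, B = A - lr * (g @ B.T), B - lr * (A.T @ g)
        W = W + A @ B                    # merge the adapter, then next round
    return W
```

In this sketch the self-generated data `y_self` plays the role of the revealed forget knowledge, and each merged low-rank delta is one incremental adjustment to the weight space; after a few rounds the forget-probe outputs shrink markedly while the retain outputs stay near their original values, mirroring the forget-quality vs. utility tradeoff the abstract describes.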
Anthology ID:
2025.findings-emnlp.1298
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2025
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
23887–23899
URL:
https://preview.aclanthology.org/name-variant-enfa-fane/2025.findings-emnlp.1298/
DOI:
10.18653/v1/2025.findings-emnlp.1298
Cite (ACL):
Linxi Xie, Xin Teng, Shichang Ke, Hongyi Wen, and Shenji Wan. 2025. Reveal and Release: Iterative LLM Unlearning with Self-generated Data. In Findings of the Association for Computational Linguistics: EMNLP 2025, pages 23887–23899, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Reveal and Release: Iterative LLM Unlearning with Self-generated Data (Xie et al., Findings 2025)
PDF:
https://preview.aclanthology.org/name-variant-enfa-fane/2025.findings-emnlp.1298.pdf
Checklist:
2025.findings-emnlp.1298.checklist.pdf