OBLIVIATE: Robust and Practical Machine Unlearning for Large Language Models

Xiaoyu Xu, Minxin Du, Qingqing Ye, Haibo Hu


Abstract
Large language models (LLMs) trained on extensive corpora risk memorizing sensitive, copyrighted, or toxic content. To address this, we propose OBLIVIATE, a robust unlearning framework that removes targeted data while preserving model utility. The framework follows a structured process: extracting target tokens, building retain sets, and fine-tuning with a tailored loss function comprising three components (masking, distillation, and world fact). Low-rank adapters (LoRA) ensure efficiency without compromising unlearning quality. We conduct experiments on multiple datasets, including the Harry Potter series, WMDP, and TOFU, using a comprehensive suite of metrics: forget quality (via a new document-level memorization score), model utility, and fluency. Results demonstrate OBLIVIATE's effectiveness in resisting membership inference attacks, minimizing the impact on retained data, and maintaining robustness across diverse scenarios.
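The abstract's three-part objective can be sketched as a weighted sum of a masking term on forget tokens, a distillation term on the retain set, and a cross-entropy term on world facts. The formulations below (and the weights `alpha`, `beta`, `gamma`) are illustrative placeholders, not the paper's exact losses:

```python
import numpy as np


def softmax(logits):
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)


def masking_loss(logits, labels, target_mask):
    """Suppress likelihood only on tokens flagged as unlearning targets.

    Minimizing the negated NLL over masked positions drives the model's
    probability of the target tokens down (a simple illustrative choice).
    """
    probs = softmax(logits)
    nll = -np.log(probs[np.arange(len(labels)), labels] + 1e-12)
    n = max(int(target_mask.sum()), 1)
    return float(-(nll * target_mask).sum() / n)


def distillation_loss(student_logits, teacher_logits):
    """KL(teacher || student) per position on retain-set tokens,
    anchoring the unlearned model to the original model's behavior."""
    p = softmax(teacher_logits)
    q = softmax(student_logits)
    kl = (p * (np.log(p + 1e-12) - np.log(q + 1e-12))).sum(axis=-1)
    return float(kl.mean())


def world_fact_loss(logits, labels):
    """Plain cross-entropy on a small set of general world facts."""
    probs = softmax(logits)
    return float(-np.log(probs[np.arange(len(labels)), labels] + 1e-12).mean())


def obliviate_style_loss(forget, retain, facts, alpha=1.0, beta=1.0, gamma=1.0):
    """Weighted sum of the three components; the weights are
    hypothetical knobs, not values taken from the paper."""
    return (alpha * masking_loss(*forget)
            + beta * distillation_loss(*retain)
            + gamma * world_fact_loss(*facts))


# Toy demonstration with random logits over a 5-token vocabulary.
rng = np.random.default_rng(0)
V, T = 5, 4
logits = rng.normal(size=(T, V))
labels = rng.integers(0, V, size=T)
mask = np.array([1, 0, 1, 0])  # positions belonging to the forget set

total = obliviate_style_loss(
    (logits, labels, mask),
    (logits, logits),   # identical student/teacher -> KL term is ~0
    (logits, labels),
)
print(total)
```

In a real setup, only the LoRA adapter parameters would be updated against this combined objective, which keeps fine-tuning cheap while the frozen base model serves as the distillation teacher.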
Anthology ID:
2025.emnlp-main.183
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
3696–3715
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.183/
Cite (ACL):
Xiaoyu Xu, Minxin Du, Qingqing Ye, and Haibo Hu. 2025. OBLIVIATE: Robust and Practical Machine Unlearning for Large Language Models. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 3696–3715, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
OBLIVIATE: Robust and Practical Machine Unlearning for Large Language Models (Xu et al., EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.183.pdf
Checklist:
2025.emnlp-main.183.checklist.pdf