SudoLM: Learning Access Control of Parametric Knowledge with Authorization Alignment

Qin Liu, Fei Wang, Chaowei Xiao, Muhao Chen


Abstract
Existing preference alignment is a one-size-fits-all alignment mechanism, where the part of the large language model (LLM) parametric knowledge with non-preferred features is uniformly blocked to all the users. However, this part of knowledge can be useful to advanced users whose expertise qualifies them to handle these information. The one-size-fits-all alignment mechanism undermines LLM’s utility for these qualified users. To address this problem, we propose SudoLM, a framework that lets LLMs learn access control over specific parametric knowledge for users with different credentials via authorization alignment. SudoLM allows authorized users to unlock their access to all the parametric knowledge with an assigned Sudo key while blocking access to non-qualified users. Experiments on two application scenarios demonstrate that SudoLM effectively controls the user’s access to the parametric knowledge and maintains its general utility.
Anthology ID:
2025.acl-long.1318
Volume:
Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2025
Address:
Vienna, Austria
Editors:
Wanxiang Che, Joyce Nabende, Ekaterina Shutova, Mohammad Taher Pilehvar
Venue:
ACL
SIG:
Publisher:
Association for Computational Linguistics
Note:
Pages:
27169–27181
Language:
URL:
https://preview.aclanthology.org/ingestion-acl-25/2025.acl-long.1318/
DOI:
Bibkey:
Cite (ACL):
Qin Liu, Fei Wang, Chaowei Xiao, and Muhao Chen. 2025. SudoLM: Learning Access Control of Parametric Knowledge with Authorization Alignment. In Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 27169–27181, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
SudoLM: Learning Access Control of Parametric Knowledge with Authorization Alignment (Liu et al., ACL 2025)
Copy Citation:
PDF:
https://preview.aclanthology.org/ingestion-acl-25/2025.acl-long.1318.pdf