GenKnowSub: Improving Modularity and Reusability of LLMs through General Knowledge Subtraction

Mohammadtaha Bagherifard, Sahar Rajabi, Ali Edalat, Yadollah Yaghoobzadeh


Abstract
Large language models (LLMs) often struggle with zero-shot generalization, and several modular approaches have been proposed to address this challenge. Yet, we hypothesize that a key limitation remains: the entanglement of general knowledge and task-specific adaptations. To overcome this, we propose a modular framework that disentangles these components by constructing a library of task-specific LoRA modules alongside a general-domain LoRA. By subtracting this general knowledge component from each task-specific module, we obtain residual modules that focus more exclusively on task-relevant information. We call this approach general knowledge subtraction, or GenKnowSub. Leveraging the refined task-specific modules and the Arrow routing algorithm, we dynamically select and combine modules for new inputs without additional training. Experiments on the Phi-3 model, with standard Arrow as the baseline, show that general knowledge LoRAs derived from diverse languages, including English, French, and German, yield consistent performance gains in both monolingual and cross-lingual settings across a wide set of benchmarks. Further experiments on Phi-2 show how GenKnowSub generalizes to a weaker LLM.
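The core step described in the abstract, subtracting a general-domain LoRA from each task-specific LoRA to leave a task-focused residual, can be sketched as parameter-wise subtraction over matching adapter weights. The sketch below is illustrative only: the parameter names and the helper function are hypothetical, not the authors' released code, and real LoRA weights would be tensors rather than flat lists.

```python
def subtract_general(task_lora, general_lora):
    """Element-wise subtraction of general-knowledge weights from a task LoRA.

    Both arguments map parameter names (e.g. "layer0.lora_A", a made-up key)
    to flat weight lists. Per the paper's hypothesis, the general-domain
    component is entangled in each task module, so removing it leaves a
    residual module focused on task-specific information.
    """
    return {
        name: [t - g for t, g in zip(weights, general_lora[name])]
        for name, weights in task_lora.items()
    }

# Toy example with two parameters per module:
task = {"layer0.lora_A": [0.5, 0.2], "layer0.lora_B": [1.0, -0.4]}
general = {"layer0.lora_A": [0.1, 0.2], "layer0.lora_B": [0.3, -0.1]}

residual = subtract_general(task, general)
```

In practice this subtraction would run once per task module in the library; the resulting residual modules are then what the Arrow router selects among at inference time.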
Anthology ID:
2025.acl-short.54
Volume:
Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)
Month:
July
Year:
2025
Address:
Vienna, Austria
Editors:
Wanxiang Che, Joyce Nabende, Ekaterina Shutova, Mohammad Taher Pilehvar
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
685–694
URL:
https://preview.aclanthology.org/landing_page/2025.acl-short.54/
Cite (ACL):
Mohammadtaha Bagherifard, Sahar Rajabi, Ali Edalat, and Yadollah Yaghoobzadeh. 2025. GenKnowSub: Improving Modularity and Reusability of LLMs through General Knowledge Subtraction. In Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 685–694, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
GenKnowSub: Improving Modularity and Reusability of LLMs through General Knowledge Subtraction (Bagherifard et al., ACL 2025)
PDF:
https://preview.aclanthology.org/landing_page/2025.acl-short.54.pdf