GenPoE: Generative Passage-level Mixture of Experts for Knowledge Enhancement of LLMs

Xuebing Liu, Shanbao Qiao, Seung-Hoon Na


Abstract
Parametric adaptation methods such as domain-adaptive pretraining (DAP), as well as retrieval-augmented generation (RAG), have been considered effective approaches for adapting large language models (LLMs) to new knowledge or domains. To unify the positive effects of parametric adaptation and RAG, this paper proposes GenPoE, a "generative" passage-level mixture of experts (MoE) for enhancing the knowledge of LLMs. The key component is a novel MoE-generating hypernetwork that takes in-context retrieved passages and generates their "expert" parameters; these generated parameters are then integrated into the LLM as expert networks. Because the parameters are "generated," GenPoE does not require a separate, often costly, parameter training or fine-tuning stage. By parameterizing passages into expert networks, GenPoE is likely to remain robust even when the retrieved passages are irrelevant. Experimental results on two open-domain question answering (QA) tasks show that GenPoE outperforms other passage-level knowledge editing methods, and its combination with RAG yields performance superior to RAG alone. Our data and code will be available at https://github.com/Liu-Xuebing/GenPoE.
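The abstract describes a hypernetwork that turns retrieved passages into "expert" parameters plugged into the LLM as MoE experts. The sketch below is only an illustration of that idea, not the authors' implementation: the low-rank expert parameterization, the gating scheme, and all module and argument names (ExpertHyperNetwork, GeneratedExpertLayer, rank, etc.) are assumptions for the example.

```python
# Minimal sketch of the GenPoE idea from the abstract (hypothetical, not the paper's code).
# Assumption: the hypernetwork maps a pooled passage encoding to low-rank FFN "expert"
# weights, and a per-token gate mixes the generated expert with the frozen base FFN output.
import torch
import torch.nn as nn


class ExpertHyperNetwork(nn.Module):
    """Generates low-rank expert FFN parameters from a retrieved-passage encoding."""

    def __init__(self, enc_dim: int, hidden_dim: int, rank: int = 8):
        super().__init__()
        self.hidden_dim, self.rank = hidden_dim, rank
        # One linear head per generated weight factor (assumed parameterization).
        self.to_down = nn.Linear(enc_dim, hidden_dim * rank)
        self.to_up = nn.Linear(enc_dim, rank * hidden_dim)

    def forward(self, passage_enc: torch.Tensor):
        # passage_enc: [enc_dim] pooled representation of one retrieved passage.
        w_down = self.to_down(passage_enc).view(self.hidden_dim, self.rank)
        w_up = self.to_up(passage_enc).view(self.rank, self.hidden_dim)
        return w_down, w_up  # expert parameters are generated, not trained per passage


class GeneratedExpertLayer(nn.Module):
    """Mixes a passage-generated expert into a frozen base FFN output."""

    def __init__(self, hidden_dim: int):
        super().__init__()
        self.gate = nn.Linear(hidden_dim, 1)  # routes between base FFN and generated expert

    def forward(self, hidden, base_ffn_out, w_down, w_up):
        # hidden, base_ffn_out: [seq, hidden_dim]; the expert is a low-rank FFN
        # built from the generated weights.
        expert_out = torch.relu(hidden @ w_down) @ w_up
        g = torch.sigmoid(self.gate(hidden))   # per-token gating score in [0, 1]
        return base_ffn_out + g * expert_out   # expert contribution added to the LLM layer
```

A usage sketch under the same assumptions: encode each retrieved passage, call the hypernetwork once per passage to obtain its expert weights, and apply GeneratedExpertLayer inside selected transformer blocks at inference time, so no per-passage fine-tuning is needed.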
Anthology ID:
2025.findings-emnlp.272
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2025
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
5082–5097
URL:
https://preview.aclanthology.org/author-page-yu-wang-polytechnic/2025.findings-emnlp.272/
DOI:
10.18653/v1/2025.findings-emnlp.272
Cite (ACL):
Xuebing Liu, Shanbao Qiao, and Seung-Hoon Na. 2025. GenPoE: Generative Passage-level Mixture of Experts for Knowledge Enhancement of LLMs. In Findings of the Association for Computational Linguistics: EMNLP 2025, pages 5082–5097, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
GenPoE: Generative Passage-level Mixture of Experts for Knowledge Enhancement of LLMs (Liu et al., Findings 2025)
PDF:
https://preview.aclanthology.org/author-page-yu-wang-polytechnic/2025.findings-emnlp.272.pdf
Checklist:
2025.findings-emnlp.272.checklist.pdf