Mixture of LoRA Experts for Continual Information Extraction with LLMs

Zitao Wang, Xinyi Wang, Wei Hu


Abstract
We study continual information extraction (IE), which aims to continually extract emerging information across diverse IE tasks while avoiding forgetting. Existing approaches are either specialized for a single IE task or suffer from catastrophic forgetting and insufficient knowledge transfer in the continual setting. This paper proposes a new continual IE model based on a token-level mixture of LoRA experts with LLMs. We leverage a LoRA router that routes each token to its most relevant LoRA experts, facilitating effective knowledge transfer among IE tasks. We guide the selection of task experts with task keys to retain IE task-specific knowledge and mitigate catastrophic forgetting. We further design a gate reflection method based on knowledge distillation to address forgetting in the LoRA router and task keys. Experimental results show that our model achieves state-of-the-art performance, effectively mitigating catastrophic forgetting and enhancing knowledge transfer in continual IE.
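Since the abstract only names the mechanism, the following is a minimal PyTorch sketch of what a token-level mixture of LoRA experts typically looks like. Everything in it (the class names LoRAExpert and TokenLoRAMoE, the rank, expert count, and dense-compute-then-mask dispatch) is a hypothetical illustration of the general technique, not the authors' implementation; the actual method, including task keys and gate reflection, is in the paper's PDF.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class LoRAExpert(nn.Module):
    """One low-rank adapter: computes the residual update (alpha/r) * x A^T B^T."""

    def __init__(self, d_model: int, rank: int = 8, alpha: float = 16.0):
        super().__init__()
        self.A = nn.Parameter(torch.randn(rank, d_model) * 0.01)  # down-projection
        self.B = nn.Parameter(torch.zeros(d_model, rank))          # up-projection, zero-init
        self.scale = alpha / rank

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (..., d_model) -> (..., d_model) low-rank update
        return self.scale * (x @ self.A.T @ self.B.T)


class TokenLoRAMoE(nn.Module):
    """Token-level mixture of LoRA experts: a linear router scores every token,
    selects its top-k experts, and mixes their outputs with renormalized gates."""

    def __init__(self, d_model: int, n_experts: int = 4, top_k: int = 2):
        super().__init__()
        self.experts = nn.ModuleList(LoRAExpert(d_model) for _ in range(n_experts))
        self.router = nn.Linear(d_model, n_experts, bias=False)
        self.top_k = top_k

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq, d_model) hidden states of the frozen backbone layer
        gates = F.softmax(self.router(x), dim=-1)        # (B, S, E) per-token gates
        top_w, top_i = gates.topk(self.top_k, dim=-1)    # each token's top-k experts
        top_w = top_w / top_w.sum(dim=-1, keepdim=True)  # renormalize over chosen experts
        update = torch.zeros_like(x)
        for e, expert in enumerate(self.experts):
            # gate weight of expert e for each token (0 where e was not selected)
            w = (top_w * (top_i == e)).sum(dim=-1, keepdim=True)  # (B, S, 1)
            update = update + w * expert(x)
        return x + update  # residual: frozen output plus the mixed LoRA update


if __name__ == "__main__":
    layer = TokenLoRAMoE(d_model=64)
    h = torch.randn(2, 10, 64)  # (batch, seq, d_model)
    print(layer(h).shape)       # torch.Size([2, 10, 64])
```

In this sketch, the gate reflection the abstract describes would amount to an extra distillation loss on the router when training a new task, for example a KL divergence between the gate distributions of the previous model and the current one; that is an assumed reading of the abstract, and the paper gives the exact formulation.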
Anthology ID:
2025.findings-emnlp.718
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2025
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
13324–13339
URL:
https://preview.aclanthology.org/name-variant-enfa-fane/2025.findings-emnlp.718/
DOI:
10.18653/v1/2025.findings-emnlp.718
Cite (ACL):
Zitao Wang, Xinyi Wang, and Wei Hu. 2025. Mixture of LoRA Experts for Continual Information Extraction with LLMs. In Findings of the Association for Computational Linguistics: EMNLP 2025, pages 13324–13339, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Mixture of LoRA Experts for Continual Information Extraction with LLMs (Wang et al., Findings 2025)
PDF:
https://preview.aclanthology.org/name-variant-enfa-fane/2025.findings-emnlp.718.pdf
Checklist:
2025.findings-emnlp.718.checklist.pdf