Sci-LoRA: Mixture of Scientific LoRAs for Cross-Domain Lay Paraphrasing

Ming Cheng, Jiaying Gong, Hoda Eldardiry


Abstract
Lay paraphrasing aims to make scientific information accessible to audiences without technical backgrounds. However, most existing studies focus on a single domain, such as biomedicine. With the rise of interdisciplinary research, it is increasingly necessary to comprehend knowledge spanning multiple technical fields. To address this, we propose Sci-LoRA, a model that leverages a mixture of LoRAs fine-tuned on multiple scientific domains. In particular, Sci-LoRA dynamically generates and applies weights for each LoRA, enabling it to adjust the impact of different domains based on the input text, without requiring explicit domain labels. To balance domain-specific knowledge and generalization across various domains, Sci-LoRA integrates information at both the data and model levels. This dynamic fusion enhances the adaptability and performance across various domains. Experimental results across twelve domains on five public datasets show that Sci-LoRA significantly outperforms state-of-the-art large language models and demonstrates flexible generalization and adaptability in cross-domain lay paraphrasing.
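The core mechanism the abstract describes — several domain-specific LoRA adapters whose contributions are mixed by input-dependent weights, with no explicit domain label — can be sketched roughly as follows. This is a hypothetical minimal illustration, not the authors' implementation: the linear gating map `G`, the sizes, and the single-layer setting are all assumptions made for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

d, r, k = 16, 4, 3  # hidden size, LoRA rank, number of domain LoRAs (illustrative values)

# Frozen base weight and k domain-specific LoRA factor pairs (A_i, B_i).
W = rng.standard_normal((d, d)) * 0.02
A = [rng.standard_normal((r, d)) * 0.02 for _ in range(k)]
B = [np.zeros((d, r)) for _ in range(k)]  # B initialized to zero, as in standard LoRA

# Hypothetical gating map: projects the input to k softmax-normalized domain weights.
G = rng.standard_normal((k, d)) * 0.02

def softmax(z):
    z = z - z.max()  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def mixture_lora_forward(x):
    """Mix k domain LoRA updates with weights computed from the input itself."""
    g = softmax(G @ x)  # dynamic per-domain weights; no domain label required
    delta = sum(g[i] * (B[i] @ (A[i] @ x)) for i in range(k))
    return W @ x + delta, g

x = rng.standard_normal(d)
y, gates = mixture_lora_forward(x)
```

In a real model this mixing would be applied at each adapted layer of the backbone, and the gates would come from a learned network over the input text's representation; the sketch only shows the weighted-sum structure that lets domain influence vary per input.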
Anthology ID:
2025.findings-acl.953
Volume:
Findings of the Association for Computational Linguistics: ACL 2025
Month:
July
Year:
2025
Address:
Vienna, Austria
Editors:
Wanxiang Che, Joyce Nabende, Ekaterina Shutova, Mohammad Taher Pilehvar
Venues:
Findings | WS
Publisher:
Association for Computational Linguistics
Pages:
18524–18541
URL:
https://preview.aclanthology.org/acl25-workshop-ingestion/2025.findings-acl.953/
Cite (ACL):
Ming Cheng, Jiaying Gong, and Hoda Eldardiry. 2025. Sci-LoRA: Mixture of Scientific LoRAs for Cross-Domain Lay Paraphrasing. In Findings of the Association for Computational Linguistics: ACL 2025, pages 18524–18541, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
Sci-LoRA: Mixture of Scientific LoRAs for Cross-Domain Lay Paraphrasing (Cheng et al., Findings 2025)
PDF:
https://preview.aclanthology.org/acl25-workshop-ingestion/2025.findings-acl.953.pdf