MediSwift: Efficient Sparse Pre-trained Biomedical Language Models
Vithursan Thangarasa, Mahmoud Salem, Shreyas Saxena, Chen-Yu Leong, Joel Hestness, Sean Lie
Abstract
Large language models (LLMs) are typically trained on general source data for various domains, but a recent surge in domain-specific LLMs has shown their potential to outperform general-purpose models in domain-specific tasks (e.g., biomedicine). Although domain-specific pre-training enhances efficiency and leads to smaller models, the computational costs of training these LLMs remain high, posing budgeting challenges. We introduce MediSwift, a suite of biomedical LMs that leverage sparse pre-training on domain-specific biomedical text data. By inducing up to 75% weight sparsity during the pre-training phase, MediSwift achieves a 2-2.5x reduction in training FLOPs. Notably, all sparse pre-training was performed on the Cerebras CS-2 system, which is specifically designed to realize the acceleration benefits from unstructured weight sparsity, thereby significantly enhancing the efficiency of the MediSwift models. Through subsequent dense fine-tuning and strategic soft prompting, MediSwift models outperform existing LLMs up to 7B parameters on biomedical tasks, setting new efficiency-accuracy benchmarks on tasks such as PubMedQA. Our results show that sparse pre-training, along with dense fine-tuning and soft prompting, offers an effective method for creating high-performing, computationally efficient models in specialized domains.

- Anthology ID: 2024.findings-acl.14
- Volume: Findings of the Association for Computational Linguistics ACL 2024
- Month: August
- Year: 2024
- Address: Bangkok, Thailand and virtual meeting
- Editors: Lun-Wei Ku, Andre Martins, Vivek Srikumar
- Venue: Findings
- Publisher: Association for Computational Linguistics
- Pages: 214–230
- URL: https://aclanthology.org/2024.findings-acl.14
- Cite (ACL): Vithursan Thangarasa, Mahmoud Salem, Shreyas Saxena, Chen-Yu Leong, Joel Hestness, and Sean Lie. 2024. MediSwift: Efficient Sparse Pre-trained Biomedical Language Models. In Findings of the Association for Computational Linguistics ACL 2024, pages 214–230, Bangkok, Thailand and virtual meeting. Association for Computational Linguistics.
- Cite (Informal): MediSwift: Efficient Sparse Pre-trained Biomedical Language Models (Thangarasa et al., Findings 2024)
- PDF: https://preview.aclanthology.org/nschneid-patch-4/2024.findings-acl.14.pdf
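
The sparse-pretrain-then-dense-finetune recipe described in the abstract can be illustrated with a minimal sketch. The snippet below is not the authors' implementation (MediSwift's sparse pre-training ran on the Cerebras CS-2, which accelerates unstructured weight sparsity in hardware); it is a hypothetical PyTorch illustration in which `SparseLinear`, the fixed random mask, and the 75% sparsity level are assumptions made for demonstration. On a GPU this masking saves no compute, since dense kernels still multiply the zeros; it only emulates the sparsity pattern.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseLinear(nn.Module):
    """Linear layer with a fixed unstructured weight-sparsity mask.

    Hypothetical sketch: zeroes out `sparsity` fraction of weights during
    pre-training, then drops the mask ("densifies") for fine-tuning.
    """

    def __init__(self, in_features: int, out_features: int, sparsity: float = 0.75):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(out_features, in_features) * 0.02)
        self.bias = nn.Parameter(torch.zeros(out_features))
        # Fixed random mask; real systems may pick the pattern more carefully.
        mask = (torch.rand(out_features, in_features) >= sparsity).float()
        self.register_buffer("mask", mask)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Masked matmul: gradients flow only to unmasked weights. Hardware
        # like the Cerebras CS-2 can skip the zeroed multiplications; on a
        # GPU this line is pure emulation with no FLOP savings.
        return F.linear(x, self.weight * self.mask, self.bias)

    def densify(self) -> None:
        # Switch to dense fine-tuning: every weight becomes active again.
        self.mask.fill_(1.0)


# Usage: one sparse pre-training step, then densify before fine-tuning.
layer = SparseLinear(512, 512, sparsity=0.75)
x = torch.randn(8, 512)
loss = layer(x).pow(2).mean()
loss.backward()    # gradients reach only the ~25% unmasked weights
layer.densify()    # all weights participate in dense fine-tuning
```

Keeping the mask fixed throughout pre-training makes the per-step FLOP count predictable; the paper reports a 2-2.5x reduction in training FLOPs at up to 75% sparsity, with quality recovered through the dense fine-tuning (and soft prompting) stage that the `densify` step above loosely mirrors.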