zFLoRA: Zero-Latency Fused Low-Rank Adapters

Dhananjaya Gowda, Seoha Song, Harshith Goka, Junhyun Lee


Abstract
Large language models (LLMs) are increasingly deployed with task-specific adapters catering to multiple downstream applications. In this scenario, the additional compute associated with the seemingly insignificant number of adapter parameters (typically less than 1% of the base model) turns out to be disproportionately large at inference time (up to 2.5x that of the base model). In this paper, we propose a new zero-latency fused low-rank adapter (zFLoRA) that introduces zero or negligible latency overhead on top of the base model. Experimental results on LLMs of size 1B, 3B and 7B show that zFLoRA compares favorably against popular supervised fine-tuning benchmarks, including low-rank adapters (LoRA) as well as full fine-tuning (FFT). Experiments are conducted on 18 different tasks across three categories, namely commonsense reasoning, math reasoning and summary-dialogue. Latency measurements on NPU (Samsung Galaxy S25+) and GPU (NVIDIA H100) platforms show that the proposed zFLoRA adapters introduce zero to negligible latency overhead.
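
To make the latency argument concrete, the sketch below shows the standard LoRA setup in PyTorch: an unmerged adapter adds extra matrix multiplications on every forward pass, whereas folding the low-rank update B·A into the base weight removes that overhead entirely. This is a minimal illustration of the general merge-based fusion principle for a single plain linear layer; it is not the paper's zFLoRA architecture, whose fusion scheme is described in the full text.

# Minimal PyTorch sketch of zero-overhead adapter fusion for one linear layer.
# Illustrates the textbook LoRA merge (W <- W + scaling * B @ A) only; the
# paper's zFLoRA design is its own method and is not reproduced here.

import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Base linear layer with an (optionally merged) low-rank adapter."""
    def __init__(self, in_features: int, out_features: int, r: int = 8, alpha: int = 16):
        super().__init__()
        self.base = nn.Linear(in_features, out_features, bias=False)
        self.lora_A = nn.Parameter(torch.randn(r, in_features) * 0.01)  # down-projection
        self.lora_B = nn.Parameter(torch.zeros(out_features, r))        # up-projection
        self.scaling = alpha / r
        self.merged = False

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        y = self.base(x)
        if not self.merged:
            # Unmerged path: these extra matmuls are the source of adapter latency.
            y = y + (x @ self.lora_A.T @ self.lora_B.T) * self.scaling
        return y

    @torch.no_grad()
    def merge(self) -> None:
        """Fold the adapter into the base weight so inference costs exactly one matmul."""
        if not self.merged:
            self.base.weight += self.scaling * (self.lora_B @ self.lora_A)
            self.merged = True

# Usage: after fine-tuning, merge once; inference then runs at base-model cost.
layer = LoRALinear(4096, 4096, r=8)
x = torch.randn(1, 4096)
layer.merge()
out = layer(x)  # single dense matmul, no adapter overhead

In a deployment that serves several task-specific adapters on one base model, a single static merge of this kind is less straightforward, which is the multi-adapter setting the abstract describes.
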
Anthology ID:
2025.emnlp-main.1086
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
21412–21429
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1086/
Cite (ACL):
Dhananjaya Gowda, Seoha Song, Harshith Goka, and Junhyun Lee. 2025. zFLoRA: Zero-Latency Fused Low-Rank Adapters. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 21412–21429, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
zFLoRA: Zero-Latency Fused Low-Rank Adapters (Gowda et al., EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1086.pdf
Checklist:
2025.emnlp-main.1086.checklist.pdf