Memory-Efficient Backpropagation for Fine-Tuning LLMs on Resource-Constrained Mobile Devices

Congzheng Song, Xinyu Tang


Abstract
Fine-tuning large language models (LLMs) with backpropagation, even for a subset of parameters such as LoRA, can be much more memory-consuming than inference and is often deemed impractical for resource-constrained mobile devices. Alternative methods, such as zeroth-order optimization (ZO), can greatly reduce the memory footprint but come at the cost of significantly slower model convergence (10× to 100× more steps than backpropagation). We propose a memory-efficient implementation of backpropagation (MeBP) on mobile devices that allows flexible trade-offs between memory usage and compute time, while converging faster and achieving better performance than the ZO baseline. We verify the effectiveness of MeBP on an iPhone 15 Pro Max and show that various LLMs, ranging from 0.5B to 4B parameters, can be fine-tuned using less than 1GB of memory. We release an example of the MeBP implementation at https://github.com/apple/ml-mebp.
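To make the memory/compute trade-off described in the abstract concrete, below is a minimal, hypothetical PyTorch sketch of gradient checkpointing, one standard way to trade recomputation time for activation memory during backpropagation. It is not the paper's MeBP implementation (see the linked repository for that); the class name and structure here are illustrative assumptions.

import torch
import torch.nn as nn
from torch.utils.checkpoint import checkpoint

# Hypothetical illustration only; not the MeBP implementation.
class CheckpointedBlockStack(nn.Module):
    """Runs a stack of transformer-style blocks with checkpointing."""

    def __init__(self, blocks: nn.ModuleList):
        super().__init__()
        self.blocks = blocks

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        for block in self.blocks:
            # Each block's intermediate activations are freed after the
            # forward pass and recomputed on the backward pass, so peak
            # activation memory scales with one block rather than the
            # full stack, at the cost of an extra forward computation.
            x = checkpoint(block, x, use_reentrant=False)
        return x

Checkpointing recovers exact gradients while shrinking peak memory; zeroth-order methods go further by estimating gradients from forward passes alone, which removes backpropagation memory entirely but, as the abstract notes, can require 10× to 100× more steps to converge.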
Anthology ID:
2025.emnlp-industry.52
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing: Industry Track
Month:
November
Year:
2025
Address:
Suzhou (China)
Editors:
Saloni Potdar, Lina Rojas-Barahona, Sebastien Montella
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
766–774
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-industry.52/
Cite (ACL):
Congzheng Song and Xinyu Tang. 2025. Memory-Efficient Backpropagation for Fine-Tuning LLMs on Resource-Constrained Mobile Devices. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing: Industry Track, pages 766–774, Suzhou (China). Association for Computational Linguistics.
Cite (Informal):
Memory-Efficient Backpropagation for Fine-Tuning LLMs on Resource-Constrained Mobile Devices (Song & Tang, EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-industry.52.pdf