Siegfried Kunzmann
2025
MaZO: Masked Zeroth-Order Optimization for Multi-Task Fine-Tuning of Large Language Models
Zhen Zhang | Yifan Yang | Kai Zhen | Nathan Susanj | Athanasios Mouchtaris | Siegfried Kunzmann | Zheng Zhang
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Large language models have demonstrated exceptional capabilities across diverse tasks, but their fine-tuning demands significant memory, posing challenges for resource-constrained environments. Zeroth-order (ZO) optimization provides a memory-efficient alternative by eliminating the need for backpropagation. However, ZO optimization suffers from high gradient variance, and prior research has largely focused on single-task learning, leaving its application to multi-task learning unexplored. Multi-task learning is crucial for leveraging shared knowledge across tasks to improve generalization, yet it introduces unique challenges under ZO settings, such as amplified gradient variance and collinearity. In this paper, we present MaZO, the first framework specifically designed for multi-task LLM fine-tuning under ZO optimization. MaZO tackles these challenges at the parameter level through two key innovations: a weight importance metric to identify critical parameters and a multi-task weight update mask to selectively update these parameters, reducing the dimensionality of the parameter space and mitigating task conflicts. Experiments demonstrate that MaZO achieves state-of-the-art performance, surpassing even multi-task learning methods designed for first-order optimization.
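The abstract describes zeroth-order fine-tuning with a parameter-level update mask. A minimal sketch of the general idea is below (not the authors' implementation): an SPSA-style two-point gradient estimate, with the perturbation restricted to a binary mask so only selected parameters are updated. The names `masked_zo_step`, `loss_fn`, and `mask` are illustrative placeholders, and the toy quadratic objective stands in for an LLM fine-tuning loss.

```python
# Illustrative sketch only: masked zeroth-order (SPSA-style) update,
# assuming a flattened parameter tensor and a scalar loss closure.
import torch


def masked_zo_step(params, mask, loss_fn, lr=1e-3, eps=1e-3):
    """One zeroth-order update restricted to masked parameters.

    params : 1-D tensor of model parameters (flattened)
    mask   : binary tensor of the same shape; 1 = parameter may be updated
    loss_fn: callable mapping a parameter tensor to a scalar loss
    """
    # Random perturbation, zeroed outside the masked subspace to reduce
    # the effective dimensionality of the gradient estimate.
    z = torch.randn_like(params) * mask

    # Two forward passes; no backpropagation is needed.
    loss_plus = loss_fn(params + eps * z)
    loss_minus = loss_fn(params - eps * z)

    # Scalar finite-difference estimate of the directional derivative along z.
    g_scalar = (loss_plus - loss_minus) / (2 * eps)

    # Update only the masked parameters.
    return params - lr * g_scalar * z


if __name__ == "__main__":
    # Toy quadratic objective standing in for a fine-tuning loss.
    target = torch.tensor([1.0, -2.0, 0.5, 3.0])
    loss_fn = lambda w: ((w - target) ** 2).sum()

    w = torch.zeros(4)
    mask = torch.tensor([1.0, 1.0, 0.0, 0.0])  # only the first two weights are trainable
    for _ in range(2000):
        w = masked_zo_step(w, mask, loss_fn, lr=0.05, eps=1e-3)
    print(w)  # first two entries approach the target; masked-out entries stay at 0
```

In the paper's setting, the mask would instead be derived from a multi-task weight importance metric rather than chosen by hand as above.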
2000
SPEECON - Speech Data for Consumer Devices
Rainer Siemund | Harald Höge | Siegfried Kunzmann | Krzysztof Marasek
Proceedings of the Second International Conference on Language Resources and Evaluation (LREC’00)