What the Weight?! A Unified Framework for Zero-Shot Knowledge Composition

Carolin Holtermann, Markus Frohmann, Navid Rekabsaz, Anne Lauscher


Abstract
The knowledge encapsulated in a model is the core factor determining its final performance on downstream tasks. Much research in NLP has focused on efficient methods for storing and adapting different types of knowledge, e.g., in dedicated modularized structures, and on how to effectively combine these, e.g., by learning additional parameters. However, given the many possible options, a thorough understanding of the mechanisms involved in these compositions is missing, and hence it remains unclear which strategies to utilize. To address this research gap, we propose a novel framework for zero-shot module composition, which encompasses existing and some novel variations for selecting, weighting, and combining parameter modules under a single unified notion. Focusing on the scenario of domain knowledge and adapter layers, our framework provides a systematic unification of concepts, allowing us to conduct the first comprehensive benchmarking study of various zero-shot knowledge composition strategies. In particular, we test two module combination methods and five selection and weighting strategies for their effectiveness and efficiency in an extensive experimental setup. Our results highlight the efficacy of ensembling but also hint at the power of simple though often-ignored weighting methods. Further in-depth analyses allow us to understand the role of weighting vs. top-k selection, and show that, to a certain extent, the performance of adapter composition can even be predicted.
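The abstract describes composing adapter modules zero-shot by selecting, weighting, and combining them. As an illustrative sketch only (not the paper's actual code; function names and the uniform-weighting choice are assumptions), output ensembling can be expressed as a weighted sum of per-module output vectors:

```python
# Hypothetical sketch of zero-shot adapter composition via output
# ensembling: each candidate module produces an output vector, a
# weighting strategy assigns module weights, and the composed output
# is their weighted sum. Uniform weighting is just one of the
# strategies the paper benchmarks.

def weight_uniform(module_outputs):
    # Simplest weighting strategy: equal weight per module.
    n = len(module_outputs)
    return [1.0 / n] * n

def compose_outputs(module_outputs, weights):
    # Output ensembling: elementwise weighted sum of module outputs.
    dim = len(module_outputs[0])
    combined = [0.0] * dim
    for out, w in zip(module_outputs, weights):
        for i, v in enumerate(out):
            combined[i] += w * v
    return combined

# Toy example: outputs of two domain adapters on one input.
outputs = [[1.0, 0.0], [0.0, 1.0]]
weights = weight_uniform(outputs)
print(compose_outputs(outputs, weights))  # [0.5, 0.5]
```

A top-k selection strategy would simply zero out the weights of all but the k highest-weighted modules before composing.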
Anthology ID:
2024.findings-eacl.77
Volume:
Findings of the Association for Computational Linguistics: EACL 2024
Month:
March
Year:
2024
Address:
St. Julian’s, Malta
Editors:
Yvette Graham, Matthew Purver
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
1138–1157
URL:
https://preview.aclanthology.org/build-pipeline-with-new-library/2024.findings-eacl.77/
Cite (ACL):
Carolin Holtermann, Markus Frohmann, Navid Rekabsaz, and Anne Lauscher. 2024. What the Weight?! A Unified Framework for Zero-Shot Knowledge Composition. In Findings of the Association for Computational Linguistics: EACL 2024, pages 1138–1157, St. Julian’s, Malta. Association for Computational Linguistics.
Cite (Informal):
What the Weight?! A Unified Framework for Zero-Shot Knowledge Composition (Holtermann et al., Findings 2024)
PDF:
https://preview.aclanthology.org/build-pipeline-with-new-library/2024.findings-eacl.77.pdf
Software:
 2024.findings-eacl.77.software.zip
Video:
 https://preview.aclanthology.org/build-pipeline-with-new-library/2024.findings-eacl.77.mp4