Abstract
Methods for adapting language models (LMs) to new tasks and domains have traditionally assumed white-box access to the model, and work by modifying its parameters. However, this is incompatible with a recent trend in the field, where the highest quality models are only available as black-boxes through inference APIs. Even when the model weights are available, the computational cost of fine-tuning large LMs can be prohibitive for most practitioners. In this work, we present a lightweight method for adapting large LMs to new domains and tasks, assuming no access to their weights or intermediate activations. Our approach fine-tunes a small white-box LM and combines it with the large black-box LM at the probability level through a small network, learned on a small validation set. We validate our approach by adapting a large LM (OPT-30B) to several domains and a downstream task (machine translation), observing improved performance in all cases, of up to 9%, while using a domain expert 23x smaller.
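The abstract's core idea, combining the two models at the probability level through a small learned network, can be pictured as a learned token-level interpolation of the two output distributions. Below is a minimal PyTorch sketch, not the paper's implementation: the gating features (per-model entropy), the network shape, and all names are illustrative assumptions.

```python
import torch
import torch.nn as nn

class ProbCombiner(nn.Module):
    """Hypothetical sketch: combine a black-box LM's next-token
    probabilities with a small fine-tuned expert's, at the probability
    level, via a learned per-token mixture weight."""

    def __init__(self, feature_dim: int = 2, hidden: int = 16):
        super().__init__()
        # Small network mapping cheap features (here, each model's
        # predictive entropy) to a mixture weight in (0, 1).
        self.gate = nn.Sequential(
            nn.Linear(feature_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 1),
            nn.Sigmoid(),
        )

    def forward(self, p_large: torch.Tensor, p_small: torch.Tensor) -> torch.Tensor:
        # p_large, p_small: (batch, vocab) next-token distributions,
        # obtainable from model outputs alone (no access to weights or
        # intermediate activations required).
        eps = 1e-9
        ent_large = -(p_large * (p_large + eps).log()).sum(-1, keepdim=True)
        ent_small = -(p_small * (p_small + eps).log()).sum(-1, keepdim=True)
        lam = self.gate(torch.cat([ent_large, ent_small], dim=-1))
        # Convex combination of the two distributions, per token.
        return lam * p_large + (1.0 - lam) * p_small
```

Under this sketch, the gate's handful of parameters would be fit by minimizing the negative log-likelihood of the combined distribution on held-out text, mirroring the abstract's statement that the combination network is learned on a small validation set.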
- Anthology ID: 2023.emnlp-main.180
- Volume: Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
- Month: December
- Year: 2023
- Address: Singapore
- Editors: Houda Bouamor, Juan Pino, Kalika Bali
- Venue: EMNLP
- Publisher: Association for Computational Linguistics
- Pages: 2961–2974
- URL: https://aclanthology.org/2023.emnlp-main.180
- DOI: 10.18653/v1/2023.emnlp-main.180
- Cite (ACL): Aitor Ormazabal, Mikel Artetxe, and Eneko Agirre. 2023. CombLM: Adapting Black-Box Language Models through Small Fine-Tuned Models. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 2961–2974, Singapore. Association for Computational Linguistics.
- Cite (Informal): CombLM: Adapting Black-Box Language Models through Small Fine-Tuned Models (Ormazabal et al., EMNLP 2023)
- PDF: https://preview.aclanthology.org/emnlp22-frontmatter/2023.emnlp-main.180.pdf