Meta-Learning Online Adaptation of Language Models

Nathan Hu, Eric Mitchell, Christopher Manning, Chelsea Finn


Abstract
Large language models encode impressively broad world knowledge in their parameters. However, the knowledge in static language models falls out of date, limiting the model’s effective “shelf life.” While online fine-tuning can reduce this degradation, we find that naively fine-tuning on a stream of documents leads to a low level of information uptake. We hypothesize that online fine-tuning does not sufficiently attend to important information. That is, the gradient signal from important tokens representing factual information is drowned out by the gradient from inherently noisy tokens, suggesting that a dynamic, context-aware learning rate may be beneficial. We therefore propose learning which tokens to upweight. We meta-train a small, autoregressive model to reweight the language modeling loss for each token during online fine-tuning, with the objective of maximizing the out-of-date base question-answering model’s ability to answer questions about a document after a single weighted gradient step. We call this approach Context-aware Meta-learned Loss Scaling (CaMeLS). Across three different distributions of documents, our experiments find that CaMeLS provides substantially improved information uptake on streams of thousands of documents compared with standard fine-tuning and baseline heuristics for reweighting token losses.
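The following is a minimal sketch of the bi-level update the abstract describes: a small weighting network produces per-token loss weights for a document, the base model takes one weighted language-modeling gradient step, and the weighting network is then updated to reduce the adapted model's QA loss. The toy module sizes, the softplus weighting head, the functional inner step, and the random "document"/QA data are illustrative assumptions for this sketch, not the authors' released CaMeLS implementation.

```python
# Sketch of a CaMeLS-style meta-training step (assumed toy setup, not the paper's code).
import torch
import torch.nn as nn
import torch.nn.functional as F

vocab, dim, seq_len = 100, 32, 16

# Stand-ins: base_lm plays the out-of-date QA language model,
# weight_model the small autoregressive loss-scaling network.
base_lm = nn.Sequential(nn.Embedding(vocab, dim), nn.Linear(dim, vocab))
weight_model = nn.Sequential(nn.Embedding(vocab, dim), nn.Linear(dim, 1))
meta_opt = torch.optim.Adam(weight_model.parameters(), lr=1e-3)
inner_lr = 1e-2  # learning rate of the single inner (online fine-tuning) step

def lm_logits(params, tokens):
    # Functional forward pass of the toy "language model" given explicit parameters,
    # so the inner update stays differentiable for the outer (meta) gradient.
    emb = F.embedding(tokens, params["0.weight"])
    return F.linear(emb, params["1.weight"], params["1.bias"])

for _ in range(100):
    # Toy data: a "document" plus a QA sequence whose answer the document supports.
    doc = torch.randint(0, vocab, (1, seq_len))
    qa_tokens = torch.randint(0, vocab, (1, seq_len))

    # 1) Per-token importance weights from the small weighting model
    #    (softplus keeps them nonnegative; an assumption of this sketch).
    w = F.softplus(weight_model(doc)).squeeze(-1)        # (1, seq_len)

    # 2) Inner step: one weighted LM gradient step on the document.
    params = dict(base_lm.named_parameters())
    logits = lm_logits(params, doc[:, :-1])
    tok_loss = F.cross_entropy(
        logits.reshape(-1, vocab), doc[:, 1:].reshape(-1), reduction="none")
    inner_loss = (w[:, 1:].reshape(-1) * tok_loss).mean()
    grads = torch.autograd.grad(inner_loss, list(params.values()),
                                create_graph=True)
    adapted = {k: p - inner_lr * g
               for (k, p), g in zip(params.items(), grads)}

    # 3) Outer step: QA loss of the adapted model, backpropagated through the
    #    inner update into the weighting model only.
    qa_logits = lm_logits(adapted, qa_tokens[:, :-1])
    qa_loss = F.cross_entropy(qa_logits.reshape(-1, vocab),
                              qa_tokens[:, 1:].reshape(-1))
    meta_opt.zero_grad()
    qa_loss.backward()
    meta_opt.step()
```

At online-adaptation time, only steps 1 and 2 would run on each incoming document: the frozen weighting model scores its tokens, and the base model takes the weighted fine-tuning step.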
Anthology ID:
2023.emnlp-main.268
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
4418–4432
URL:
https://aclanthology.org/2023.emnlp-main.268
DOI:
10.18653/v1/2023.emnlp-main.268
Cite (ACL):
Nathan Hu, Eric Mitchell, Christopher Manning, and Chelsea Finn. 2023. Meta-Learning Online Adaptation of Language Models. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 4418–4432, Singapore. Association for Computational Linguistics.
Cite (Informal):
Meta-Learning Online Adaptation of Language Models (Hu et al., EMNLP 2023)
PDF:
https://preview.aclanthology.org/nschneid-patch-4/2023.emnlp-main.268.pdf
Video:
https://preview.aclanthology.org/nschneid-patch-4/2023.emnlp-main.268.mp4