Rapid Word Learning Through Meta In-Context Learning

Wentao Wang, Guangyuan Jiang, Tal Linzen, Brenden Lake


Abstract
Humans can quickly learn a new word from a few illustrative examples, and then systematically and flexibly use it in novel contexts. Yet the abilities of current language models for few-shot word learning, and methods for improving these abilities, are underexplored. In this study, we introduce a novel method, Meta-training for IN-context learNing Of Words (Minnow). This method trains language models to generate new examples of a word’s usage given a few in-context examples, using a special placeholder token to represent the new word. This training is repeated on many new words to develop a general word-learning ability. We find that training models from scratch with Minnow on human-scale child-directed language enables strong few-shot word learning, comparable to a large language model (LLM) pre-trained on orders of magnitude more data. Furthermore, through discriminative and generative evaluations, we demonstrate that finetuning pre-trained LLMs with Minnow improves their ability to discriminate between new words, identify syntactic categories of new words, and generate reasonable new usages and definitions for new words, based on one or a few in-context examples. These findings highlight the data efficiency of Minnow and its potential to improve language model performance in word learning tasks.
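The page gives only the abstract, not the paper's exact input format. As a minimal sketch of the episode construction the abstract describes (masking a new word with a special placeholder token across a few in-context usage examples, then training the model to generate a further usage), the snippet below builds one such episode. The placeholder string, separator token, and function name are illustrative assumptions, not the paper's actual implementation.

```python
import re

# Illustrative assumptions: the paper uses a special placeholder token for the
# new word, but the exact token string and separator format are not specified here.
PLACEHOLDER = "<new_word>"
SEP = " <sep> "

def make_minnow_episode(word: str, usages: list[str]) -> tuple[str, str]:
    """Build one meta-training episode: the model sees k-1 usages of a word
    (with the word masked by a placeholder) and is trained to generate the
    k-th usage, also with the word masked."""
    pattern = re.compile(rf"\b{re.escape(word)}\b", flags=re.IGNORECASE)
    masked = [pattern.sub(PLACEHOLDER, u) for u in usages]
    context, target = masked[:-1], masked[-1]
    return SEP.join(context) + SEP, target

# Example: one episode for the word "ball" from child-directed utterances.
prompt, target = make_minnow_episode(
    "ball",
    ["throw the ball to me", "the ball is red", "can you kick the ball"],
)
print(prompt)  # throw the <new_word> to me <sep> the <new_word> is red <sep>
print(target)  # can you kick the <new_word>
```

Repeating this construction over many different words is what gives the model a general word-learning ability, rather than knowledge of any single word.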
Anthology ID:
2025.emnlp-main.1631
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
32026–32061
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1631/
Cite (ACL):
Wentao Wang, Guangyuan Jiang, Tal Linzen, and Brenden Lake. 2025. Rapid Word Learning Through Meta In-Context Learning. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 32026–32061, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Rapid Word Learning Through Meta In-Context Learning (Wang et al., EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1631.pdf
Checklist:
 2025.emnlp-main.1631.checklist.pdf