MemoryPrompt: A Light Wrapper to Improve Context Tracking in Pre-trained Language Models

Nathanael Carraz Rakotonirina, Marco Baroni


Abstract
Transformer-based language models (LMs) track contextual information through large, hard-coded input windows. We introduce MemoryPrompt, a leaner approach in which the LM is complemented by a small auxiliary recurrent network that passes information to the LM by prefixing its regular input with a sequence of vectors, akin to soft prompts, without requiring LM finetuning. Tested on a task designed to probe a LM’s ability to keep track of multiple fact updates, a MemoryPrompt-augmented LM outperforms much larger LMs that have access to the full input history. We also test MemoryPrompt on a long-distance dialogue dataset, where its performance is comparable to that of a model conditioned on the entire conversation history. In both experiments we also observe that, unlike full-finetuning approaches, MemoryPrompt does not suffer from catastrophic forgetting when adapted to new tasks, thus not disrupting the generalist capabilities of the underlying LM.
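To make the setup described in the abstract concrete, the sketch below illustrates the general idea in PyTorch: a small recurrent module reads each input chunk, carries its state across chunks, and emits a handful of soft-prompt vectors that are prepended to the frozen LM's input embeddings. This is a minimal illustrative sketch, not the authors' released implementation; the names (MemoryModule, num_memory_tokens) and the GPT-2 backbone are assumptions made here for demonstration.

```python
# Minimal sketch of a MemoryPrompt-style wrapper (illustrative, not the authors' code):
# an auxiliary recurrent network produces soft-prompt vectors from each text chunk
# and prefixes them to the regular input of a frozen pre-trained LM.
import torch
import torch.nn as nn
from transformers import GPT2LMHeadModel, GPT2Tokenizer

class MemoryModule(nn.Module):
    """Small recurrent network: summarizes chunks and emits soft-prompt vectors."""
    def __init__(self, d_model: int, num_memory_tokens: int = 5, d_hidden: int = 256):
        super().__init__()
        self.rnn = nn.GRU(d_model, d_hidden, batch_first=True)
        self.to_prompts = nn.Linear(d_hidden, num_memory_tokens * d_model)
        self.num_memory_tokens = num_memory_tokens
        self.d_model = d_model

    def forward(self, chunk_embeds: torch.Tensor, state=None):
        # chunk_embeds: (batch, seq_len, d_model) -- the LM's own token embeddings
        _, state = self.rnn(chunk_embeds, state)           # recurrent state carries info across chunks
        prompts = self.to_prompts(state[-1])               # (batch, K * d_model)
        prompts = prompts.view(-1, self.num_memory_tokens, self.d_model)
        return prompts, state

# Frozen pre-trained LM; only the memory module's parameters would be trained.
tok = GPT2Tokenizer.from_pretrained("gpt2")
lm = GPT2LMHeadModel.from_pretrained("gpt2")
for p in lm.parameters():
    p.requires_grad = False

memory = MemoryModule(d_model=lm.config.n_embd)

ids = tok("The capital was moved to a new city.", return_tensors="pt").input_ids
chunk_embeds = lm.get_input_embeddings()(ids)              # (1, seq_len, 768)
prompts, state = memory(chunk_embeds)                      # soft-prompt vectors for this chunk
inputs_embeds = torch.cat([prompts, chunk_embeds], dim=1)  # prefix the regular input
out = lm(inputs_embeds=inputs_embeds)                      # frozen LM conditioned on the memory prefix
```

In this sketch the recurrent state (`state`) is what persists across successive chunks, so information about earlier fact updates can influence later predictions without enlarging the LM's input window or finetuning the LM itself.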
Anthology ID: 2024.lrec-main.976
Volume: Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024)
Month: May
Year: 2024
Address: Torino, Italia
Editors: Nicoletta Calzolari, Min-Yen Kan, Veronique Hoste, Alessandro Lenci, Sakriani Sakti, Nianwen Xue
Venues: LREC | COLING
Publisher: ELRA and ICCL
Pages: 11187–11195
URL: https://aclanthology.org/2024.lrec-main.976
Cite (ACL): Nathanael Carraz Rakotonirina and Marco Baroni. 2024. MemoryPrompt: A Light Wrapper to Improve Context Tracking in Pre-trained Language Models. In Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024), pages 11187–11195, Torino, Italia. ELRA and ICCL.
Cite (Informal): MemoryPrompt: A Light Wrapper to Improve Context Tracking in Pre-trained Language Models (Rakotonirina & Baroni, LREC-COLING 2024)
PDF: https://preview.aclanthology.org/nschneid-patch-2/2024.lrec-main.976.pdf