Pushdown Layers: Encoding Recursive Structure in Transformer Language Models
Shikhar Murty, Pratyusha Sharma, Jacob Andreas, Christopher Manning
Abstract
Recursion is a prominent feature of human language, and one that is fundamentally challenging for self-attention, which lacks an explicit recursive-state tracking mechanism. Consequently, Transformer language models capture long-tail recursive structure poorly and exhibit sample-inefficient syntactic generalization. This work introduces Pushdown Layers, a new self-attention layer that models recursive state via a stack tape tracking the estimated depth of every token in an incremental parse of the observed prefix. Transformer LMs with Pushdown Layers are syntactic language models that autoregressively and synchronously update this stack tape as they predict new tokens, in turn using the stack tape to softly modulate attention over tokens, for instance by learning to "skip" over closed constituents. When trained on a corpus of strings annotated with silver constituency parses, Transformers equipped with Pushdown Layers achieve dramatically better and 3-5x more sample-efficient syntactic generalization while maintaining comparable perplexities. Pushdown Layers are a drop-in replacement for standard self-attention; we illustrate this by finetuning GPT2-medium with Pushdown Layers on an automatically parsed WikiText-103, which yields improvements on several GLUE text classification tasks.
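The abstract describes attention scores being softly modulated by a stack tape of per-token depths in an incremental parse. As a rough illustration only, and not the authors' implementation, the sketch below shows one way such depth-conditioned modulation could look in PyTorch: the module name `DepthModulatedSelfAttention`, the learned per-head depth bias, and the assumption that depths are supplied as inputs (rather than predicted synchronously by the model, as in the paper) are all hypothetical choices made for exposition.

```python
# Minimal sketch, assuming depths from a "stack tape" are given as inputs.
# Not the paper's implementation; names and the additive-bias form are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DepthModulatedSelfAttention(nn.Module):
    """Causal self-attention whose scores are additively biased by a learned
    per-head embedding of each key token's estimated constituent depth."""
    def __init__(self, d_model: int, n_heads: int, max_depth: int = 32):
        super().__init__()
        assert d_model % n_heads == 0
        self.n_heads = n_heads
        self.d_head = d_model // n_heads
        self.qkv = nn.Linear(d_model, 3 * d_model)
        self.out = nn.Linear(d_model, d_model)
        # One scalar bias per (depth, head): how much each head up- or
        # down-weights tokens sitting at a given depth of the incremental parse.
        self.depth_bias = nn.Embedding(max_depth, n_heads)

    def forward(self, x: torch.Tensor, stack_depths: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq, d_model); stack_depths: (batch, seq) integer depths
        B, T, _ = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        q = q.view(B, T, self.n_heads, self.d_head).transpose(1, 2)
        k = k.view(B, T, self.n_heads, self.d_head).transpose(1, 2)
        v = v.view(B, T, self.n_heads, self.d_head).transpose(1, 2)

        scores = q @ k.transpose(-2, -1) / self.d_head ** 0.5  # (B, H, T, T)
        # Soft modulation: add a per-head bias for each key token's depth,
        # which can let a head learn to "skip" tokens inside closed constituents.
        bias = self.depth_bias(stack_depths)                   # (B, T, H)
        scores = scores + bias.permute(0, 2, 1).unsqueeze(2)   # broadcast over queries

        causal = torch.tril(torch.ones(T, T, dtype=torch.bool, device=x.device))
        scores = scores.masked_fill(~causal, float("-inf"))
        attn = F.softmax(scores, dim=-1)
        out = (attn @ v).transpose(1, 2).reshape(B, T, -1)
        return self.out(out)

# Usage sketch with random inputs and depths:
layer = DepthModulatedSelfAttention(d_model=64, n_heads=4)
x = torch.randn(2, 10, 64)
depths = torch.randint(0, 8, (2, 10))
y = layer(x, depths)  # (2, 10, 64)
```

In the paper the stack tape is itself updated autoregressively as tokens are generated; this sketch omits that component and only illustrates the attention-modulation side of the idea.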
- Anthology ID: 2023.emnlp-main.195
- Volume: Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
- Month: December
- Year: 2023
- Address: Singapore
- Editors: Houda Bouamor, Juan Pino, Kalika Bali
- Venue: EMNLP
- Publisher: Association for Computational Linguistics
- Pages: 3233–3247
- URL: https://aclanthology.org/2023.emnlp-main.195
- DOI: 10.18653/v1/2023.emnlp-main.195
- Cite (ACL): Shikhar Murty, Pratyusha Sharma, Jacob Andreas, and Christopher Manning. 2023. Pushdown Layers: Encoding Recursive Structure in Transformer Language Models. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 3233–3247, Singapore. Association for Computational Linguistics.
- Cite (Informal): Pushdown Layers: Encoding Recursive Structure in Transformer Language Models (Murty et al., EMNLP 2023)
- PDF: https://aclanthology.org/2023.emnlp-main.195.pdf