Exploring the Hidden Capacity of LLMs for One-Step Text Generation

Gleb Mezentsev, Ivan Oseledets


Abstract
A recent study showed that large language models (LLMs) can reconstruct surprisingly long texts — up to thousands of tokens — via autoregressive generation from just one trained input embedding. In this work, we explore whether autoregressive decoding is essential for such reconstruction. We show that frozen LLMs can generate hundreds of accurate tokens in a single token-parallel forward pass when provided with only two learned embeddings. This reveals a surprising and underexplored multi-token generation capability of autoregressive LLMs. We examine these embeddings and characterize the information they encode. We also show empirically that, although these representations are not unique for a given text, they form connected and local regions in embedding space, suggesting the potential to train a practical encoder. The existence of such representations hints that multi-token generation may be natively accessible in off-the-shelf LLMs via a learned input encoder, avoiding heavy retraining and helping to overcome the fundamental bottleneck of autoregressive decoding while reusing already-trained models.
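The abstract describes fitting a small number of input embeddings against a frozen LLM so that an entire target text can be read out from one token-parallel forward pass. The sketch below illustrates that general idea only; it is not the authors' exact protocol, and the model choice (gpt2), the filler-slot layout, and the optimization hyperparameters are assumptions made to keep the example self-contained.

```python
# Minimal sketch (not the paper's exact method): optimize two trainable input
# embeddings so that a FROZEN causal LM emits a target text in one forward pass.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # assumption: any HF causal LM can serve as the frozen decoder
tok = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name).eval()
for p in model.parameters():
    p.requires_grad_(False)  # the LLM itself stays frozen

target = "The quick brown fox jumps over the lazy dog."
target_ids = tok(target, return_tensors="pt").input_ids      # shape (1, N)
n = target_ids.shape[1]

emb_dim = model.get_input_embeddings().embedding_dim
# Two trainable input vectors; the remaining n-1 slots reuse the second vector
# as a filler (an assumption, chosen only so the sequence has n+1 positions).
soft = torch.nn.Parameter(0.02 * torch.randn(1, 2, emb_dim))
opt = torch.optim.Adam([soft], lr=1e-2)

for step in range(500):
    opt.zero_grad()
    filler = soft[:, 1:2, :].expand(1, n - 1, emb_dim)
    inputs_embeds = torch.cat([soft, filler], dim=1)          # (1, n + 1, d)
    logits = model(inputs_embeds=inputs_embeds).logits        # one parallel pass
    pred = logits[:, 1:n + 1, :]   # position i should emit target token i
    loss = torch.nn.functional.cross_entropy(
        pred.reshape(-1, pred.size(-1)), target_ids.reshape(-1))
    loss.backward()
    opt.step()

# Read all tokens out of a single forward pass with the fitted embeddings.
with torch.no_grad():
    filler = soft[:, 1:2, :].expand(1, n - 1, emb_dim)
    out = model(inputs_embeds=torch.cat([soft, filler], dim=1)).logits
    print(tok.decode(out[0, 1:n + 1].argmax(-1)))
```

Because every filler slot carries the same learned vector, each output position in this sketch is distinguished only by positional information; this is one plausible way to set up a token-parallel readout, not necessarily the layout used in the paper.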
Anthology ID:
2025.emnlp-main.1165
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
22891–22900
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1165/
Cite (ACL):
Gleb Mezentsev and Ivan Oseledets. 2025. Exploring the Hidden Capacity of LLMs for One-Step Text Generation. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 22891–22900, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Exploring the Hidden Capacity of LLMs for One-Step Text Generation (Mezentsev & Oseledets, EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1165.pdf
Checklist:
2025.emnlp-main.1165.checklist.pdf