ReSeeding Latent States for Sequential Language Understanding
Stéphane Aroca-Ouellette, Katharina von der Wense, Alessandro Roncone
Abstract
We introduce Refeeding State Embeddings aligned using Environmental Data (ReSEED), a novel method for grounding language in environmental data. While large language models (LLMs) excel at many tasks, they continue to struggle with multi-step sequential reasoning. ReSEED addresses this by producing latent embeddings aligned with the true state of the environment and refeeding these embeddings into the model before generating its output. To evaluate its effectiveness, we develop three new sequential reasoning benchmarks, each with a training set of paired state-text trajectories and several text-only evaluation sets that test generalization to longer, unseen trajectories. Across all benchmarks, ReSEED significantly improves generalization and scalability over a text-only baseline. We further show that ReSEED outperforms commercial LLMs on our benchmarks, highlighting the value of grounding language in the environment.
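The paper's exact architecture is not reproduced here, but the abstract's description suggests a simple pattern: a head predicts a latent state embedding from the model's hidden states, an auxiliary alignment loss ties that prediction to an encoding of the true environment state during training, and the predicted embedding is fed back to the model before it generates its output. Below is a minimal PyTorch sketch of that pattern; all module names, shapes, and the choice of MSE as the alignment loss (`ReSEEDSketch`, `state_head`, `d_state`, etc.) are our assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class ReSEEDSketch(nn.Module):
    """Illustrative sketch of the refeeding idea (not the paper's code):
    predict a latent state embedding, align it with an encoding of the
    true environment state, and refeed it as an extra input position."""

    def __init__(self, d_model=256, d_state=32, vocab_size=1000):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True),
            num_layers=2,
        )
        # Hypothetical heads; names and shapes are assumptions.
        self.state_head = nn.Linear(d_model, d_model)     # predicts latent state
        self.state_encoder = nn.Linear(d_state, d_model)  # encodes true env state
        self.lm_head = nn.Linear(d_model, vocab_size)

    def forward(self, tokens, true_state=None):
        h = self.encoder(self.embed(tokens))              # (B, T, d_model)
        state_emb = self.state_head(h[:, -1])             # predicted latent state

        # Alignment loss against the encoded true state (training only).
        align_loss = None
        if true_state is not None:
            target = self.state_encoder(true_state)
            align_loss = nn.functional.mse_loss(state_emb, target)

        # "Refeed": append the latent state as one extra input position,
        # re-encode, and read out next-token logits from the last position.
        x = torch.cat([self.embed(tokens), state_emb.unsqueeze(1)], dim=1)
        logits = self.lm_head(self.encoder(x)[:, -1])
        return logits, align_loss


# Toy usage: 2 trajectories of 16 tokens, 32-dim environment states.
model = ReSEEDSketch()
tokens = torch.randint(0, 1000, (2, 16))
true_state = torch.randn(2, 32)
logits, align_loss = model(tokens, true_state)
```

At evaluation time the true state would be unavailable (the benchmarks' evaluation sets are text-only), so only the predicted `state_emb` is refed; the alignment loss applies during training on the paired state-text trajectories.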
- Anthology ID:
- 2025.emnlp-main.1281
- Volume:
- Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
- Month:
- November
- Year:
- 2025
- Address:
- Suzhou, China
- Editors:
- Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rosé, Violet Peng
- Venue:
- EMNLP
- Publisher:
- Association for Computational Linguistics
- Pages:
- 25233–25247
- URL:
- https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1281/
- Cite (ACL):
- Stéphane Aroca-Ouellette, Katharina von der Wense, and Alessandro Roncone. 2025. ReSeeding Latent States for Sequential Language Understanding. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 25233–25247, Suzhou, China. Association for Computational Linguistics.
- Cite (Informal):
- ReSeeding Latent States for Sequential Language Understanding (Aroca-Ouellette et al., EMNLP 2025)
- PDF:
- https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1281.pdf