Beyond Demonstrations: Dynamic Vector Construction from Latent Representations

Wang Cai, Hsiu-Yuan Huang, Zhixiang Wang, Yunfang Wu


Abstract
In-Context derived Vector (ICV) methods extract task-relevant representations from large language models (LLMs) and reinject them during inference, achieving performance comparable to few-shot In-Context Learning (ICL) without repeated demonstration processing. However, existing ICV methods remain sensitive to ICL-specific factors, often use coarse or semantically fragmented representations as the source of the vector, and rely on heuristic-based injection positions, limiting their applicability. To address these issues, we propose Dynamic Vector (DyVec), which incorporates an Exhaustive Query Rotation (EQR) strategy to extract robust, semantically aggregated latent representations by mitigating the variance introduced by ICL. It then applies Dynamic Latent Segmentation and Injection to adaptively partition representations based on task complexity, and leverages REINFORCE-based optimization to learn optimal injection positions for each segment. Experimental results show that DyVec outperforms few-shot ICL, LoRA, and prior ICV baselines. Further analysis highlights the effectiveness of dynamically segmenting and injecting semantically aggregated latent representations. DyVec provides a lightweight and data-efficient solution for inference-time task adaptation.
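The REINFORCE-based search for injection positions described in the abstract can be illustrated with a toy sketch. Everything below is hypothetical: the segment count, candidate layer count, and the reward function are invented for illustration, and the sketch only shows the general policy-gradient pattern (a per-segment softmax policy over candidate injection layers, updated toward higher reward), not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

n_segments = 4   # hypothetical: latent vector split into 4 segments
n_layers = 8     # hypothetical: 8 candidate injection layers

# One softmax policy per segment over candidate injection layers.
logits = np.zeros((n_segments, n_layers))

def reward(positions):
    # Stand-in for downstream task accuracy: injecting each segment
    # near a made-up "good" layer scores higher.
    target = np.array([1, 3, 5, 7])
    return float(np.exp(-np.abs(positions - target).mean()))

lr = 0.5
baseline = 0.0   # moving-average baseline for variance reduction
for step in range(300):
    # Softmax with max-subtraction for numerical stability.
    probs = np.exp(logits - logits.max(axis=1, keepdims=True))
    probs /= probs.sum(axis=1, keepdims=True)
    # Sample one injection layer per segment from the current policy.
    positions = np.array(
        [rng.choice(n_layers, p=probs[s]) for s in range(n_segments)]
    )
    r = reward(positions)
    baseline = 0.9 * baseline + 0.1 * r
    # REINFORCE update: grad of log pi(a) w.r.t. logits is onehot(a) - probs.
    for s in range(n_segments):
        grad = -probs[s]
        grad[positions[s]] += 1.0
        logits[s] += lr * (r - baseline) * grad

learned = logits.argmax(axis=1)  # most probable injection layer per segment
```

In this sketch the policy for each segment starts uniform over layers; sampled position sets that yield higher reward have their log-probabilities increased, so the policy concentrates on better injection layers. A real setting would replace `reward` with validation performance of the model after injecting the segments at the sampled layers.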
Anthology ID:
2025.emnlp-main.297
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
5853–5868
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.297/
Cite (ACL):
Wang Cai, Hsiu-Yuan Huang, Zhixiang Wang, and Yunfang Wu. 2025. Beyond Demonstrations: Dynamic Vector Construction from Latent Representations. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 5853–5868, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Beyond Demonstrations: Dynamic Vector Construction from Latent Representations (Cai et al., EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.297.pdf
Checklist:
 2025.emnlp-main.297.checklist.pdf