Representing LLMs in Prompt Semantic Task Space

Idan Kashani, Avi Mendelson, Yaniv Nemcovsky


Abstract
Large language models (LLMs) achieve impressive results across a wide range of tasks, and ever-expanding public repositories contain an abundance of pre-trained models. Identifying the best-performing LLM for a given task is therefore a significant challenge. Previous works have suggested learning LLM representations to address this. However, these approaches scale poorly and require costly retraining to encompass additional models and datasets. Moreover, the produced representations occupy distinct spaces that cannot be easily interpreted. This work presents an efficient, training-free approach to representing LLMs as linear operators within the prompts' semantic task space, thus providing a highly interpretable representation of the models' application. Our method utilizes closed-form computation of geometrical properties and ensures exceptional scalability and real-time adaptability to dynamically expanding repositories. We demonstrate our approach on success prediction and model selection tasks, achieving competitive or state-of-the-art results with notable performance in out-of-sample scenarios.
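The abstract's core idea — representing each LLM as a linear operator over prompt embeddings, fit in closed form, then used for success prediction and model selection — can be illustrated with a minimal sketch. This is not the authors' implementation; the embedding dimension, the random stand-in embeddings, the synthetic success labels, and the plain least-squares fit are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in prompt embeddings; in practice these would come from a
# semantic encoder applied to the evaluation prompts.
d = 8              # assumed embedding dimension
n_prompts = 100
E = rng.normal(size=(n_prompts, d))

# Hypothetical per-model success labels (1.0 = the model answered
# the prompt correctly) over the same evaluation prompts.
labels = {
    "model_a": (rng.random(n_prompts) < 0.7).astype(float),
    "model_b": (rng.random(n_prompts) < 0.5).astype(float),
}

# Represent each model as a linear operator in the prompt embedding
# space, obtained in closed form via least squares (no gradient training).
operators = {
    name: np.linalg.lstsq(E, y, rcond=None)[0]
    for name, y in labels.items()
}

# Success prediction for a new prompt: project its embedding onto
# each model's operator.
query = rng.normal(size=d)
scores = {name: float(query @ w) for name, w in operators.items()}

# Model selection: pick the model with the highest predicted success.
best = max(scores, key=scores.get)
```

Because the operators live in the same space as the prompt embeddings, a new model or dataset only requires one more closed-form fit, which is the scalability property the abstract emphasizes.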
Anthology ID:
2025.findings-emnlp.456
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2025
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
8578–8597
URL:
https://preview.aclanthology.org/author-page-yu-wang-polytechnic/2025.findings-emnlp.456/
DOI:
10.18653/v1/2025.findings-emnlp.456
Bibkey:
Cite (ACL):
Idan Kashani, Avi Mendelson, and Yaniv Nemcovsky. 2025. Representing LLMs in Prompt Semantic Task Space. In Findings of the Association for Computational Linguistics: EMNLP 2025, pages 8578–8597, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Representing LLMs in Prompt Semantic Task Space (Kashani et al., Findings 2025)
PDF:
https://preview.aclanthology.org/author-page-yu-wang-polytechnic/2025.findings-emnlp.456.pdf
Checklist:
2025.findings-emnlp.456.checklist.pdf