STARE at the Structure: Steering ICL Exemplar Selection with Structural Alignment

Jiaqian Li, Qisheng Hu, Jing Li, Wenya Wang


Abstract
In-Context Learning (ICL) has become a powerful paradigm that enables LLMs to perform a wide range of tasks without task-specific fine-tuning. However, the effectiveness of ICL heavily depends on the quality of exemplar selection. In particular, for structured prediction tasks such as semantic parsing, existing ICL selection strategies often overlook structural alignment, leading to suboptimal performance and poor generalization. To address this issue, we propose a novel two-stage exemplar selection strategy that achieves a strong balance between efficiency, generalizability, and performance. First, we fine-tune a BERT-based retriever using structure-aware supervision, guiding it to select exemplars that are both semantically relevant and structurally aligned. Then, we enhance the retriever with a plug-in module, which amplifies syntactically meaningful information in the hidden representations. This plug-in is model-agnostic, requires minimal overhead, and can be seamlessly integrated into existing pipelines. Experiments on four benchmarks spanning three semantic parsing tasks demonstrate that our method consistently outperforms existing baselines with multiple recent LLMs as inference-time models.
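
To make the idea of structure-aware exemplar selection concrete, here is a minimal, hypothetical sketch. It is not the paper's retriever (which fine-tunes a BERT encoder with structure-aware supervision and adds a plug-in module); instead, a crude lexical-overlap score stands in for the learned semantic signal, and overlap between abstracted output templates stands in for structural alignment. All names, the template heuristic, and the mixing weight alpha are illustrative assumptions.

    # Hypothetical illustration of structure-aware exemplar ranking.
    # NOT the paper's method: simple lexical and template overlap stand in
    # for the semantic and structural signals a fine-tuned retriever learns.
    from dataclasses import dataclass
    import re

    @dataclass
    class Exemplar:
        question: str        # natural-language input
        logical_form: str    # target structured output (e.g., SQL)

    def template(logical_form: str) -> frozenset:
        # Abstract away literals/identifiers; keep structural tokens only.
        tokens = re.findall(r"[A-Za-z_]+|[()=<>*,]", logical_form.upper())
        keywords = {"SELECT", "FROM", "WHERE", "GROUP", "BY", "ORDER", "JOIN",
                    "COUNT", "MAX", "MIN", "AVG", "AND", "OR",
                    "(", ")", "=", "<", ">", "*"}
        return frozenset(t for t in tokens if t in keywords)

    def lexical_sim(a: str, b: str) -> float:
        # Jaccard overlap of word sets: a stand-in for a learned semantic score.
        wa, wb = set(a.lower().split()), set(b.lower().split())
        return len(wa & wb) / max(len(wa | wb), 1)

    def structural_sim(a: str, b: str) -> float:
        # Overlap between abstracted output templates: a stand-in for
        # structural alignment.
        ta, tb = template(a), template(b)
        return len(ta & tb) / max(len(ta | tb), 1)

    def rank_exemplars(query: str, draft_form: str, pool: list[Exemplar],
                       k: int = 2, alpha: float = 0.5) -> list[Exemplar]:
        # alpha trades off semantic relevance against structural alignment
        # (assumed weight, not from the paper).
        scored = sorted(
            pool,
            key=lambda ex: alpha * lexical_sim(query, ex.question)
                           + (1 - alpha) * structural_sim(draft_form, ex.logical_form),
            reverse=True,
        )
        return scored[:k]

    if __name__ == "__main__":
        pool = [
            Exemplar("how many cities are in France",
                     "SELECT COUNT(*) FROM city WHERE country = 'France'"),
            Exemplar("list all rivers in Germany",
                     "SELECT name FROM river WHERE country = 'Germany'"),
            Exemplar("what is the longest river",
                     "SELECT name FROM river ORDER BY length DESC"),
        ]
        # At inference the target form is unknown; a draft parse (or the query
        # itself) can serve as a proxy when computing structural similarity.
        picks = rank_exemplars("how many rivers are in Spain",
                               "SELECT COUNT(*) FROM river WHERE country = 'Spain'",
                               pool)
        for ex in picks:
            print(ex.question, "->", ex.logical_form)

In this toy ranking, the first exemplar (a COUNT query) would be preferred over a semantically similar but structurally different one, which is the behavior the paper's structure-aware retriever is trained to produce.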
Anthology ID:
2025.emnlp-main.746
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
14776–14793
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.746/
Cite (ACL):
Jiaqian Li, Qisheng Hu, Jing Li, and Wenya Wang. 2025. STARE at the Structure: Steering ICL Exemplar Selection with Structural Alignment. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 14776–14793, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
STARE at the Structure: Steering ICL Exemplar Selection with Structural Alignment (Li et al., EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.746.pdf
Checklist:
2025.emnlp-main.746.checklist.pdf