Mapping the Course for Prompt-based Structured Prediction

Matt Pauk, Maria Leonor Pacheco


Abstract
Large language models (LLMs) have demonstrated strong performance in a wide range of language tasks without requiring task-specific fine-tuning. However, they remain prone to hallucinations and inconsistencies, and often struggle with complex reasoning, in part due to the limitations of autoregressive generation. We propose to address some of these issues, particularly for structured prediction, by combining LLMs with combinatorial inference to marry the predictive power of LLMs with the structural consistency provided by inference methods. We perform exhaustive experiments to understand which prompting strategies can best estimate confidence values for downstream symbolic inference, and find that, independent of prompting strategy, incorporating symbolic inference yields more consistent and accurate predictions than prompting alone. Finally, we show that calibration and fine-tuning with structured learning objectives further increase performance on challenging tasks, highlighting that structured learning remains valuable in the era of LLMs.
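
The abstract pairs LLM-derived confidence estimates with combinatorial inference to enforce structural consistency. The sketch below is a minimal illustration of that general idea, not the paper's implementation: it assumes hypothetical per-token confidence scores (standing in for scores read off an LLM's label distribution under some prompting strategy) and a toy BIO tagging constraint, and performs exact MAP inference by brute force.

```python
import itertools
import math

# Illustrative only: hypothetical per-token label confidences, as might be
# estimated from an LLM's answers under a given prompting strategy.
LABELS = ["B", "I", "O"]
confidences = [
    {"B": 0.70, "I": 0.10, "O": 0.20},  # token 1
    {"B": 0.05, "I": 0.60, "O": 0.35},  # token 2
    {"B": 0.15, "I": 0.55, "O": 0.30},  # token 3
]

def is_consistent(seq):
    """Toy BIO constraint: 'I' must continue a span, never start one."""
    if seq[0] == "I":
        return False
    return all(not (cur == "I" and prev == "O")
               for prev, cur in zip(seq, seq[1:]))

def map_inference(confs):
    """Return the highest-scoring label sequence that satisfies the
    constraint. Brute force is fine at toy scale; a real system would
    use an ILP solver or dynamic programming instead."""
    best_seq, best_score = None, -math.inf
    for seq in itertools.product(LABELS, repeat=len(confs)):
        if not is_consistent(seq):
            continue  # discard structurally inconsistent outputs
        score = sum(math.log(c[label]) for c, label in zip(confs, seq))
        if score > best_score:
            best_seq, best_score = seq, score
    return best_seq

print(map_inference(confidences))  # -> ('B', 'I', 'I')
```

Note that the unconstrained argmax per token would also yield ('B', 'I', 'I') here only because the constraint happens to agree; when the LLM's top choices violate the structure, the combinatorial step overrides them with the best consistent alternative, which is the behavior the abstract attributes to symbolic inference.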
Anthology ID:
2026.eacl-long.160
Volume:
Proceedings of the 19th Conference of the European Chapter of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
March
Year:
2026
Address:
Rabat, Morocco
Editors:
Vera Demberg, Kentaro Inui, Lluís Màrquez
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
3483–3508
URL:
https://preview.aclanthology.org/ingest-eacl/2026.eacl-long.160/
Cite (ACL):
Matt Pauk and Maria Leonor Pacheco. 2026. Mapping the Course for Prompt-based Structured Prediction. In Proceedings of the 19th Conference of the European Chapter of the Association for Computational Linguistics (Volume 1: Long Papers), pages 3483–3508, Rabat, Morocco. Association for Computational Linguistics.
Cite (Informal):
Mapping the Course for Prompt-based Structured Prediction (Pauk & Pacheco, EACL 2026)
PDF:
https://preview.aclanthology.org/ingest-eacl/2026.eacl-long.160.pdf