Semantic Agreement Enables Efficient Open-Ended LLM Cascades

Duncan Soiffer, Steven Kolawole, Virginia Smith


Abstract
Cascade systems for open-ended text generation face a fundamental challenge: determining output reliability when generation quality lies on a continuous spectrum, often with multiple valid responses. To address this, we propose _semantic agreement_—meaning-level consensus between ensemble outputs—as a training-free signal for reliable deferral. We show that when diverse model outputs agree semantically, their consensus is a stronger reliability signal than token-level confidence. Evaluated across models ranging from 500M to 70B parameters, semantic cascades improve deferral accuracy, match or surpass target-model quality at 40% of the cost, and reduce latency by up to 60%. Our method requires no model internals, works across black-box APIs, and remains robust to model updates, making it a practical baseline for real-world LLM deployment.
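As a rough illustration of the idea described in the abstract, the sketch below shows a two-tier cascade: candidate outputs from a cheap tier are compared for meaning-level agreement, and the query is deferred to the larger target model when agreement is low. This is not the authors' implementation; the `small_models` / `large_generate` callables, the sentence-embedding encoder, and the 0.8 threshold are illustrative assumptions, and the paper's actual agreement measure and deferral rule may differ.

```python
# Minimal sketch of a semantic-agreement cascade (illustrative, not the
# paper's exact method). `small_models` and `large_generate` are hypothetical
# text-generation callables; the encoder and threshold are assumed choices.
from itertools import combinations
from sentence_transformers import SentenceTransformer, util

encoder = SentenceTransformer("all-MiniLM-L6-v2")  # assumed off-the-shelf embedder


def semantic_agreement(outputs: list[str]) -> float:
    """Mean pairwise cosine similarity between candidate outputs
    (a simple proxy for meaning-level consensus)."""
    if len(outputs) < 2:
        return 1.0  # a single output trivially agrees with itself
    embeddings = encoder.encode(outputs, convert_to_tensor=True)
    sims = [util.cos_sim(embeddings[i], embeddings[j]).item()
            for i, j in combinations(range(len(outputs)), 2)]
    return sum(sims) / len(sims)


def cascade(prompt: str, small_models, large_generate, threshold: float = 0.8) -> str:
    """Serve the cheap ensemble's answer when its outputs agree semantically;
    otherwise defer the query to the larger target model."""
    candidates = [generate(prompt) for generate in small_models]
    if semantic_agreement(candidates) >= threshold:
        return candidates[0]       # consensus reached: accept the cheap answer
    return large_generate(prompt)  # disagreement: defer to the expensive model
```

Because the signal is computed only from generated text, a sketch like this works over black-box APIs with no access to token-level probabilities, consistent with the training-free, model-internal-free setting the abstract describes.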
Anthology ID:
2025.emnlp-industry.171
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing: Industry Track
Month:
November
Year:
2025
Address:
Suzhou (China)
Editors:
Saloni Potdar, Lina Rojas-Barahona, Sebastien Montella
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
2499–2537
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-industry.171/
Cite (ACL):
Duncan Soiffer, Steven Kolawole, and Virginia Smith. 2025. Semantic Agreement Enables Efficient Open-Ended LLM Cascades. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing: Industry Track, pages 2499–2537, Suzhou (China). Association for Computational Linguistics.
Cite (Informal):
Semantic Agreement Enables Efficient Open-Ended LLM Cascades (Soiffer et al., EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-industry.171.pdf