Duncan Soiffer


2025

Semantic Agreement Enables Efficient Open-Ended LLM Cascades
Duncan Soiffer | Steven Kolawole | Virginia Smith
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing: Industry Track

Cascade systems for open-ended text generation face a fundamental challenge: determining output reliability when generation quality lies on a continuous spectrum, often with multiple valid responses. To address this, we propose _semantic agreement_—meaning-level consensus between ensemble outputs—as a training-free signal for reliable deferral. We show that when diverse model outputs agree semantically, their consensus is a stronger reliability signal than token-level confidence. Evaluated on models ranging from 500M to 70B parameters, semantic cascades improve deferral accuracy, match or surpass target-model quality at 40% of the cost, and reduce latency by up to 60%. Our method requires no model internals, works across black-box APIs, and remains robust to model updates, making it a practical baseline for real-world LLM deployment.
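To make the deferral rule concrete, below is a minimal sketch of the loop the abstract describes, assuming mean pairwise cosine similarity of sentence embeddings as the agreement measure. The embedder choice, the `THRESHOLD` value, and the `generate_small` / `generate_large` callables are illustrative placeholders, not the paper's implementation.

```python
# Sketch of semantic-agreement deferral for an LLM cascade.
# Assumptions (not from the paper): agreement is measured as mean pairwise
# cosine similarity of sentence embeddings; generate_small / generate_large
# are hypothetical text-generation callables; THRESHOLD is untuned.
from itertools import combinations

from sentence_transformers import SentenceTransformer, util

embedder = SentenceTransformer("all-MiniLM-L6-v2")
THRESHOLD = 0.85  # assumed agreement cutoff; tune on a validation set


def semantic_agreement(outputs: list[str]) -> float:
    """Mean pairwise cosine similarity between candidate outputs."""
    embeddings = embedder.encode(outputs, convert_to_tensor=True)
    sims = [
        util.cos_sim(embeddings[i], embeddings[j]).item()
        for i, j in combinations(range(len(outputs)), 2)
    ]
    return sum(sims) / len(sims)


def cascade(prompt: str, generate_small, generate_large, n_samples: int = 3) -> str:
    """Answer with cheap-model samples when they agree; otherwise defer."""
    candidates = [generate_small(prompt) for _ in range(n_samples)]
    if semantic_agreement(candidates) >= THRESHOLD:
        return candidates[0]  # high semantic consensus: accept the cheap answer
    return generate_large(prompt)  # low consensus: defer to the target model
```

Because the rule reads only generated text, it needs no logits or model internals, which is what lets it run over black-box APIs as the abstract claims.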