Transforming Podcast Preview Generation: From Expert Models to LLM-Based Systems

Winstead Zhu, Ann Clifton, Azin Ghazimatin, Edgar Tanaka, Ward Ronan


Abstract
Discovering and evaluating long-form talk content such as videos and podcasts poses a significant challenge for users, as it requires a considerable time investment. Previews offer a practical solution by providing concise snippets that showcase key moments of the content, enabling users to make more informed and confident choices. We propose an LLM-based approach for generating podcast episode previews and deploy the solution at scale, serving hundreds of thousands of podcast previews in a real-world application. Comprehensive offline evaluations and online A/B testing demonstrate that LLM-generated previews consistently outperform a strong baseline built on top of various ML expert models, while greatly reducing the need for meticulous feature engineering. The offline results indicate notable improvements in understandability, contextual clarity, and interest level, and the online A/B test shows a 4.6% increase in user engagement with preview content, along with a 5x boost in processing efficiency, offering a more streamlined and performant solution than the feature-engineered expert-model baseline.
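
To make the approach described in the abstract concrete, the sketch below illustrates what a core LLM-based preview-selection step might look like. This is purely an illustration, not code from the paper: it assumes a timestamped ASR transcript and a hypothetical `call_llm` helper that wraps whatever chat/completion endpoint is used in production, and the prompt wording, function names, and JSON schema are all assumptions.

```python
import json
from dataclasses import dataclass


@dataclass
class PreviewClip:
    start_sec: float
    end_sec: float
    rationale: str


# Hypothetical prompt; the paper's actual prompt is not shown on this page.
PROMPT_TEMPLATE = """You are selecting a short preview for a podcast episode.
Below is a timestamped transcript. Pick ONE contiguous segment of roughly
{target_sec} seconds that is self-contained, easy to understand without prior
context, and likely to spark interest in the full episode.

Transcript:
{transcript}

Respond with JSON: {{"start_sec": <float>, "end_sec": <float>, "rationale": "<one sentence>"}}
"""


def format_transcript(segments):
    # segments: list of (start_sec, end_sec, text) tuples from an ASR system
    return "\n".join(f"[{s:.1f}-{e:.1f}] {t}" for s, e, t in segments)


def generate_preview(segments, call_llm, target_sec=60):
    """Ask an LLM to choose preview boundaries from a timestamped transcript.

    `call_llm` is a hypothetical callable (prompt: str) -> str that wraps
    whichever LLM endpoint is available; it is not part of the paper.
    """
    prompt = PROMPT_TEMPLATE.format(
        target_sec=target_sec, transcript=format_transcript(segments)
    )
    raw = call_llm(prompt)
    data = json.loads(raw)  # a production system would validate and retry on malformed output
    start, end = float(data["start_sec"]), float(data["end_sec"])
    # Clamp the proposed clip to the episode bounds before cutting audio.
    episode_end = segments[-1][1]
    start, end = max(0.0, start), min(episode_end, end)
    return PreviewClip(start, end, data.get("rationale", ""))
```

In this sketch the LLM only returns clip boundaries and a short rationale; trimming the audio and any quality checks on the selected segment would happen downstream.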
Anthology ID: 2025.acl-industry.26
Volume: Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 6: Industry Track)
Month: July
Year: 2025
Address: Vienna, Austria
Editors: Georg Rehm, Yunyao Li
Venue: ACL
Publisher: Association for Computational Linguistics
Pages: 336–344
URL: https://preview.aclanthology.org/landing_page/2025.acl-industry.26/
Cite (ACL): Winstead Zhu, Ann Clifton, Azin Ghazimatin, Edgar Tanaka, and Ward Ronan. 2025. Transforming Podcast Preview Generation: From Expert Models to LLM-Based Systems. In Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 6: Industry Track), pages 336–344, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal): Transforming Podcast Preview Generation: From Expert Models to LLM-Based Systems (Zhu et al., ACL 2025)
PDF: https://preview.aclanthology.org/landing_page/2025.acl-industry.26.pdf