Learning from LLM Agents: In-Context Generative Models for Text Casing in E-Commerce Ads

Yingxue Zhou, Tan Zhu, Tao Zeng, Zigeng Wang, Wei Shen

Abstract
E-commerce ad platforms enforce content policies and review created ads before publication, with casing requirements playing a critical role in maintaining readability and brand consistency. NER-based transformer models have been widely used for casing correction, but they classify each character independently, failing to capture sentence-level contextual dependencies and making them less reliable on unseen or ad-specific terms, e.g., brand names. LLMs like ChatGPT generalize better to proper nouns, but they are expensive and have high latency; moreover, generative models can suffer from hallucination. To address these challenges, we propose a two-stage approach: (1) an LLM-based Agent leveraging Chain-of-Actions (CoA) to enforce casing policies while accurately handling ad-specific terms, such as brand names, and (2) a lightweight generative model that preserves the LLM Agent’s knowledge while significantly reducing latency and cost. We design a novel in-context decoding strategy that avoids hallucinations. Our approach outperforms NER-based methods and achieves near-LLM-Agent performance, making it a scalable and efficient solution for real-world ad compliance automation.
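
For illustration: the in-context decoding mentioned in the abstract can be read as constrained generation in which each output token is restricted to casing variants of the aligned input token, which rules out hallucinated words by construction. The Python below is a minimal, hypothetical sketch of that idea, not the authors' implementation; casing_variants and score are assumed helpers, and a real system would score candidates with the distilled generative model rather than the toy rule used here.

    from typing import List

    def casing_variants(token: str) -> List[str]:
        # Candidate outputs for one input token: the original form plus
        # lowercase, UPPERCASE, and Title case. Constraining generation
        # to this set means only the casing of the input can change.
        return sorted({token, token.lower(), token.upper(), token.capitalize()})

    def score(candidate: str, left_context: str) -> float:
        # Hypothetical stand-in for the distilled generative model's
        # conditional score. A toy rule: Title case at sentence start,
        # lowercase elsewhere. A real scorer would also preserve registered
        # brand casings (e.g., "iPhone") via the LLM Agent's knowledge.
        if not left_context:
            return 1.0 if candidate == candidate.capitalize() else 0.0
        return 1.0 if candidate == candidate.lower() else 0.0

    def recase(text: str) -> str:
        # Greedy in-context (constrained) decoding: the i-th output token
        # must be a casing variant of the i-th input token.
        out: List[str] = []
        for token in text.split():
            best = max(casing_variants(token), key=lambda v: score(v, " ".join(out)))
            out.append(best)
        return " ".join(out)

    print(recase("buy RUNNING shoes TODAY"))  # -> Buy running shoes today

Because the candidate set at each position contains only re-casings of the corresponding input token, the decoder can change case but can never emit a word absent from the input.
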
Anthology ID:
2025.emnlp-industry.79
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing: Industry Track
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Saloni Potdar, Lina Rojas-Barahona, Sebastien Montella
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
1122–1133
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-industry.79/
Cite (ACL):
Yingxue Zhou, Tan Zhu, Tao Zeng, Zigeng Wang, and Wei Shen. 2025. Learning from LLM Agents: In-Context Generative Models for Text Casing in E-Commerce Ads. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing: Industry Track, pages 1122–1133, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Learning from LLM Agents: In-Context Generative Models for Text Casing in E-Commerce Ads (Zhou et al., EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-industry.79.pdf