LionGuard 2: Building Lightweight, Data-Efficient & Localised Multilingual Content Moderators

Leanne Tan, Gabriel Chua, Ziyu Ge, Roy Ka-Wei Lee


Abstract
Modern moderation systems increasingly support multiple languages, but often fail to address localisation and low-resource variants—creating safety gaps in real-world deployments. Small models offer a potential alternative to large LLMs, yet still demand considerable data and compute. We present LionGuard 2, a lightweight, multilingual moderation classifier tailored to the Singapore context, supporting English, Chinese, and Malay, with partial support for Tamil. Built on pre-trained OpenAI embeddings and a multi-head ordinal classifier, LionGuard 2 outperforms several commercial and open-source systems across 17 benchmarks, including both Singapore-specific and public English datasets. The system is actively deployed within the Singapore Government, demonstrating practical efficacy at scale. Our findings show that high-quality local data and robust multilingual embeddings can achieve strong moderation performance without fine-tuning large models. We release our model weights and part of our training data to support future work on LLM safety.
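The architecture described above (fixed pre-trained embeddings feeding a multi-head ordinal classifier) can be sketched roughly as follows. This is a hypothetical illustration, not the released implementation: the head count, level count, and threshold parameterisation here are assumptions, and a cumulative-threshold scheme is just one common way to model ordered severity labels.

```python
import numpy as np

# Hypothetical sketch of a multi-head ordinal classifier over fixed text
# embeddings. Each head scores one harm category; ordered severity levels
# (e.g. safe < borderline < unsafe) are modelled with per-head cumulative
# thresholds so that predictions respect the ordering.

rng = np.random.default_rng(0)
EMB_DIM, N_HEADS, N_LEVELS = 1536, 3, 3  # illustrative sizes, not the paper's

# One linear scoring vector and (N_LEVELS - 1) ordered thresholds per head.
W = rng.normal(size=(N_HEADS, EMB_DIM)) * 0.01
thresholds = np.sort(rng.normal(size=(N_HEADS, N_LEVELS - 1)), axis=1)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def predict_levels(embedding):
    """Return one ordinal severity level per head for a single embedding."""
    scores = W @ embedding                                # shape (N_HEADS,)
    # P(level > k) = sigmoid(score - threshold_k); the predicted level is
    # the number of thresholds the score exceeds.
    exceed = sigmoid(scores[:, None] - thresholds) > 0.5  # (N_HEADS, N_LEVELS-1)
    return exceed.sum(axis=1)                             # values in 0..N_LEVELS-1

levels = predict_levels(rng.normal(size=EMB_DIM))
print(levels)  # one severity level per head
```

In practice the embedding would come from an embedding API call, and the weights would be trained with an ordinal loss; the sketch only shows why the design is lightweight: inference is a handful of matrix operations on a precomputed embedding.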
Anthology ID:
2025.emnlp-demos.20
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing: System Demonstrations
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Ivan Habernal, Peter Schulam, Jörg Tiedemann
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
264–285
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-demos.20/
Cite (ACL):
Leanne Tan, Gabriel Chua, Ziyu Ge, and Roy Ka-Wei Lee. 2025. LionGuard 2: Building Lightweight, Data-Efficient & Localised Multilingual Content Moderators. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing: System Demonstrations, pages 264–285, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
LionGuard 2: Building Lightweight, Data-Efficient & Localised Multilingual Content Moderators (Tan et al., EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-demos.20.pdf