@inproceedings{zhang-etal-2023-multi-teacher,
    title = "Multi-teacher Distillation for Multilingual Spelling Correction",
    author = "Zhang, Jingfen  and
      Guo, Xuan  and
      Bodapati, Sravan  and
      Potts, Christopher",
    editor = "Wang, Mingxuan  and
      Zitouni, Imed",
    booktitle = "Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing: Industry Track",
    month = dec,
    year = "2023",
    address = "Singapore",
    publisher = "Association for Computational Linguistics",
    url = "https://preview.aclanthology.org/ingest-emnlp/2023.emnlp-industry.15/",
    doi = "10.18653/v1/2023.emnlp-industry.15",
    pages = "142--151",
    abstract = "Accurate spelling correction is a critical step in modern search interfaces, especially in an era of mobile devices and speech-to-text interfaces. For services that are deployed around the world, this poses a significant challenge for multilingual NLP: spelling errors need to be caught and corrected in all languages, and even in queries that use multiple languages. In this paper, we tackle this challenge using multi-teacher distillation. On our approach, a monolingual teacher model is trained for each language/locale, and these individual models are distilled into a single multilingual student model intended to serve all languages/locales. In experiments using open-source data as well as customer data from a worldwide search service, we show that this leads to highly effective spelling correction models that can meet the tight latency requirements of deployed services."
}
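The abstract describes distilling per-language teacher models into one multilingual student. Below is a minimal sketch of that multi-teacher distillation idea, assuming PyTorch; the names (`student`, `teachers`, `TEMPERATURE`, `ALPHA`, the batch keys) are hypothetical illustrations, not the authors' actual implementation.

```python
# Hypothetical sketch of multi-teacher distillation (per the abstract):
# one frozen monolingual teacher per language/locale supplies soft targets
# for batches in that language; a single multilingual student is updated.
import torch
import torch.nn.functional as F

TEMPERATURE = 2.0  # softens teacher distributions (assumed value)
ALPHA = 0.5        # mix between distillation and hard-label loss (assumed)

def distillation_step(student, teachers, batch, lang, optimizer):
    """One training step on a batch from language `lang`."""
    teacher = teachers[lang]  # frozen monolingual teacher for this locale
    with torch.no_grad():
        t_logits = teacher(batch["inputs"])

    s_logits = student(batch["inputs"])

    # KL divergence between temperature-softened teacher and student
    # distributions, rescaled by T^2 as in standard distillation.
    kd_loss = F.kl_div(
        F.log_softmax(s_logits / TEMPERATURE, dim=-1),
        F.softmax(t_logits / TEMPERATURE, dim=-1),
        reduction="batchmean",
    ) * TEMPERATURE**2

    # Cross-entropy against the gold corrections.
    ce_loss = F.cross_entropy(
        s_logits.view(-1, s_logits.size(-1)),
        batch["labels"].view(-1),
    )

    loss = ALPHA * kd_loss + (1 - ALPHA) * ce_loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

Cycling this step over batches drawn from all languages/locales trains the single student that the paper deploys in place of the per-language teachers.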