@inproceedings{liu-etal-2022-knowledge,
    title = "Knowledge Distillation based Contextual Relevance Matching for {E}-commerce Product Search",
    author = "Liu, Ziyang  and
      Wang, Chaokun  and
      Feng, Hao  and
      Wu, Lingfei  and
      Yang, Liqun",
    editor = "Li, Yunyao  and
      Lazaridou, Angeliki",
    booktitle = "Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing: Industry Track",
    month = dec,
    year = "2022",
    address = "Abu Dhabi, UAE",
    publisher = "Association for Computational Linguistics",
    url = "https://preview.aclanthology.org/ingest-emnlp/2022.emnlp-industry.5/",
    doi = "10.18653/v1/2022.emnlp-industry.5",
    pages = "63--76",
    abstract = "Online relevance matching is an essential task of e-commerce product search to boost the utility of search engines and ensure a smooth user experience. Previous work adopts either classical relevance matching models or Transformer-style models to address it. However, they ignore the inherent bipartite graph structures that are ubiquitous in e-commerce product search logs and are too inefficient to deploy online. In this paper, we design an efficient knowledge distillation framework for e-commerce relevance matching to integrate the respective advantages of Transformer-style models and classical relevance matching models. Especially for the core student model of the framework, we propose a novel method using k-order relevance modeling. The experimental results on large-scale real-world data (the size is 6 174 million) show that the proposed method significantly improves the prediction accuracy in terms of human relevance judgment. We deploy our method to JD.com online search platform. The A/B testing results show that our method significantly improves most business metrics under price sort mode and default sort mode."
}