BERT-based similarity learning for product matching
Janusz Tracz, Piotr Iwo Wójcik, Kalina Jasinska-Kobus, Riccardo Belluzzo, Robert Mroczkowski, Ireneusz Gawlik
Abstract
Product matching, i.e., inferring which catalog product a merchant-created offer is selling, is crucial for any e-commerce marketplace, as it enables product-based navigation, price comparisons, product reviews, etc. The problem is challenging, mostly due to the size of the product catalog, data heterogeneity, missing product representatives, and varying levels of data quality. Moreover, new products are introduced every day, making it difficult to cast the problem as a classification task. In this work, we apply BERT-based models in a similarity learning setup to solve the product matching problem. We provide a thorough ablation study, showing the impact of architecture and training objective choices. Applying transformer-based architectures and proper sampling techniques significantly boosts performance across a range of e-commerce domains, allowing for production deployment.
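The similarity-learning setup described in the abstract can be sketched as follows: a shared BERT encoder embeds offer and product texts, and a triplet objective pulls matching pairs together while pushing non-matching ones apart. This is only a minimal illustrative sketch; the checkpoint name, mean pooling, margin value, and example texts are assumptions, not the authors' exact configuration or sampling strategy.

```python
# Illustrative sketch of BERT-based similarity learning for product matching:
# encode offer and product titles with a shared BERT encoder and train with a
# triplet objective. All hyperparameters below are assumed, not from the paper.
import torch
from torch import nn
from transformers import AutoModel, AutoTokenizer

MODEL_NAME = "bert-base-multilingual-cased"  # assumed checkpoint; any BERT model works

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
encoder = AutoModel.from_pretrained(MODEL_NAME)

def embed(texts):
    """Mean-pool token embeddings into one L2-normalized vector per text."""
    batch = tokenizer(texts, padding=True, truncation=True,
                      max_length=64, return_tensors="pt")
    hidden = encoder(**batch).last_hidden_state              # (B, T, H)
    mask = batch["attention_mask"].unsqueeze(-1).float()     # (B, T, 1)
    pooled = (hidden * mask).sum(dim=1) / mask.sum(dim=1)    # masked mean pooling
    return nn.functional.normalize(pooled, dim=-1)

# Triplet loss: pull an offer towards its matching product (positive),
# push it away from a non-matching product (negative).
triplet_loss = nn.TripletMarginLoss(margin=0.3)

offers    = ["Apple iPhone 11 64GB black, new, free shipping"]
positives = ["Apple iPhone 11 64 GB Black"]
negatives = ["Apple iPhone 11 Pro 256 GB Silver"]

loss = triplet_loss(embed(offers), embed(positives), embed(negatives))
loss.backward()  # in practice this runs inside a training loop with an optimizer
```

At inference time, an offer would then be matched to the catalog product whose embedding lies closest in this space, e.g., via nearest-neighbour search over precomputed product embeddings.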
- Anthology ID:
- 2020.ecomnlp-1.7
- Volume:
- Proceedings of Workshop on Natural Language Processing in E-Commerce
- Month:
- Dec
- Year:
- 2020
- Address:
- Barcelona, Spain
- Editors:
- Huasha Zhao, Parikshit Sondhi, Nguyen Bach, Sanjika Hewavitharana, Yifan He, Luo Si, Heng Ji
- Venue:
- EcomNLP
- Publisher:
- Association for Computational Linguistics
- Pages:
- 66–75
- URL:
- https://preview.aclanthology.org/icon-24-ingestion/2020.ecomnlp-1.7/
- Cite (ACL):
- Janusz Tracz, Piotr Iwo Wójcik, Kalina Jasinska-Kobus, Riccardo Belluzzo, Robert Mroczkowski, and Ireneusz Gawlik. 2020. BERT-based similarity learning for product matching. In Proceedings of Workshop on Natural Language Processing in E-Commerce, pages 66–75, Barcelona, Spain. Association for Computational Linguistics.
- Cite (Informal):
- BERT-based similarity learning for product matching (Tracz et al., EcomNLP 2020)
- PDF:
- https://preview.aclanthology.org/icon-24-ingestion/2020.ecomnlp-1.7.pdf