Towards Open-World Product Attribute Mining: A Lightly-Supervised Approach

Liyan Xu, Chenwei Zhang, Xian Li, Jingbo Shang, Jinho D. Choi


Abstract
We present a new task setting for attribute mining on e-commerce products, serving as a practical solution for extracting open-world attributes without extensive human intervention. Our supervision comes from a high-quality seed attribute set bootstrapped from existing resources, and we aim both to expand the attribute vocabulary of existing seed types and to discover new attribute types automatically. A new dataset is created to support our setting, and our approach Amacer is proposed specifically to tackle the limited supervision. In particular, since no direct supervision is available for unseen new attributes, our novel formulation exploits self-supervised heuristics and unsupervised latent attributes, which yield implicit semantic signals as additional supervision by leveraging product context. Experiments suggest that our approach surpasses various baselines by 12 F1, expands attributes of existing types by up to 12 times, and discovers values from 39% of new types.
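The abstract only summarizes the approach at a high level. As a rough illustration of the seed-expansion idea (not the paper's actual Amacer model), the sketch below shows one common way to grow a seed attribute vocabulary: embed candidate values mined from product text and attach each candidate to the seed type whose known values it most resembles, keeping only confident matches. The embedding model, threshold, and toy seed set are illustrative assumptions, not details from the paper.

```python
# Illustrative sketch only -- NOT the paper's Amacer model.
# Expands a seed attribute vocabulary by embedding similarity:
# a candidate value joins the seed type whose known values it
# resembles most, if the similarity clears a threshold.
from sentence_transformers import SentenceTransformer, util

seed_attributes = {  # tiny hand-made seed set (hypothetical)
    "flavor": ["chocolate", "vanilla", "strawberry"],
    "material": ["cotton", "leather", "stainless steel"],
}
candidates = ["matcha", "polyester", "waterproof"]  # mined from product text

model = SentenceTransformer("all-MiniLM-L6-v2")  # assumed embedding model
SIM_THRESHOLD = 0.5  # confidence cutoff, tuned in practice

cand_emb = model.encode(candidates, convert_to_tensor=True)
for attr_type, values in seed_attributes.items():
    seed_emb = model.encode(values, convert_to_tensor=True)
    # best cosine similarity between each candidate and any seed value
    sims = util.cos_sim(cand_emb, seed_emb).max(dim=1).values
    for cand, sim in zip(candidates, sims):
        if sim >= SIM_THRESHOLD:
            print(f"{cand!r} -> {attr_type} (sim={sim:.2f})")
```

Candidates that match no seed type under the threshold would be left for the new-type discovery stage, which the paper handles with unsupervised latent attributes rather than a fixed cutoff like this.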
Anthology ID:
2023.acl-long.683
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
12223–12239
URL:
https://aclanthology.org/2023.acl-long.683
DOI:
10.18653/v1/2023.acl-long.683
Bibkey:
Cite (ACL):
Liyan Xu, Chenwei Zhang, Xian Li, Jingbo Shang, and Jinho D. Choi. 2023. Towards Open-World Product Attribute Mining: A Lightly-Supervised Approach. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 12223–12239, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Towards Open-World Product Attribute Mining: A Lightly-Supervised Approach (Xu et al., ACL 2023)
PDF:
https://preview.aclanthology.org/nschneid-patch-2/2023.acl-long.683.pdf
Video:
https://preview.aclanthology.org/nschneid-patch-2/2023.acl-long.683.mp4