Abstract
For extreme multi-label classification (XMC), existing classification-based models perform poorly on tail labels and often ignore the semantic relations among labels, e.g., treating "Wikipedia" and "Wiki" as independent and separate labels. In this paper, we cast XMC as a generation task (XLGen), where we benefit from pre-trained text-to-text models. However, generating labels from the extremely large label space is challenging without any constraints or guidance. We therefore propose to guide label generation using label cluster information to hierarchically generate lower-level labels. We also find that frequency-based label ordering and the use of decoding ensemble methods are critical factors for the improvements in XLGen. XLGen with cluster guidance significantly outperforms the classification and generation baselines on tail labels, and also generally improves the overall performance on four popular XMC benchmarks. In human evaluation, we also find that XLGen generates unseen but plausible labels. Our code is now available at https://github.com/alexa/xlgen-eacl-2023.
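The idea of cluster-guided generation can be pictured with a short sketch, shown below. This is not the authors' released implementation (see the GitHub link above for that); it is a minimal illustration assuming a T5-style text-to-text model, an offline label clustering step that assigns each cluster a token such as "<c3>", and a target format in which cluster tokens precede the frequency-ordered labels. The model name, delimiters, and helper functions are assumptions for illustration only.

```python
# Minimal sketch of cluster-guided label generation for XMC with a
# text-to-text model (assumed: t5-base, " | " and " ; " delimiters,
# cluster tokens like "<c3>" produced by an offline clustering step).
from transformers import T5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("t5-base")
model = T5ForConditionalGeneration.from_pretrained("t5-base")

def encode_example(document_text, cluster_tokens, labels_by_frequency):
    """Build a (source, target) pair: the target lists the cluster tokens
    first, then the labels ordered from most to least frequent."""
    source = tokenizer(document_text, truncation=True, return_tensors="pt")
    target_text = " ".join(cluster_tokens) + " | " + " ; ".join(labels_by_frequency)
    target = tokenizer(target_text, truncation=True, return_tensors="pt")
    return source, target

def predict_labels(document_text, num_beams=5, max_new_tokens=64):
    """Generate label text and split it back into a label set; an ensemble
    could union the sets produced by different decoding strategies."""
    inputs = tokenizer(document_text, truncation=True, return_tensors="pt")
    output_ids = model.generate(
        **inputs, num_beams=num_beams, max_new_tokens=max_new_tokens
    )
    decoded = tokenizer.decode(output_ids[0], skip_special_tokens=True)
    label_part = decoded.split("|")[-1]  # drop the cluster-token prefix, if any
    return [label.strip() for label in label_part.split(";") if label.strip()]
```

In this toy format, the generated cluster tokens act as the higher-level guidance described in the abstract: the model first commits to clusters, then produces the lower-level labels belonging to them.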
- Anthology ID: 2023.eacl-main.122
- Volume: Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics
- Month: May
- Year: 2023
- Address: Dubrovnik, Croatia
- Editors: Andreas Vlachos, Isabelle Augenstein
- Venue: EACL
- Publisher: Association for Computational Linguistics
- Pages: 1670–1685
- URL: https://aclanthology.org/2023.eacl-main.122
- DOI: 10.18653/v1/2023.eacl-main.122
- Cite (ACL): Taehee Jung, Joo-kyung Kim, Sungjin Lee, and Dongyeop Kang. 2023. Cluster-Guided Label Generation in Extreme Multi-Label Classification. In Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics, pages 1670–1685, Dubrovnik, Croatia. Association for Computational Linguistics.
- Cite (Informal): Cluster-Guided Label Generation in Extreme Multi-Label Classification (Jung et al., EACL 2023)
- PDF: https://preview.aclanthology.org/fix-dup-bibkey/2023.eacl-main.122.pdf