Prototypical Extreme Multi-label Classification with a Dynamic Margin Loss

Kunal Dahiya, Diego Ortego, David Jimenez-Cabello


Abstract
Extreme Multi-label Classification (XMC) methods predict relevant labels for a given query in an extremely large label space. Recent works in XMC address this problem using deep encoders that project text descriptions to an embedding space suitable for recovering the closest labels. However, learning deep models can be computationally expensive in large output spaces, resulting in a trade-off between high-performing brute-force approaches and efficient solutions. In this paper, we propose PRIME, an XMC method that employs a novel prototypical contrastive learning technique to reconcile efficiency and performance, surpassing brute-force approaches. We frame XMC as a data-to-prototype prediction task where label prototypes aggregate information from related queries. More precisely, we use a shallow transformer encoder, coined the Label Prototype Network, which enriches label representations by aggregating text-based embeddings, label centroids, and learnable free vectors. We jointly train a deep encoder and the Label Prototype Network using an adaptive triplet loss objective that better adapts to the high granularity and ambiguity of extreme label spaces. PRIME achieves state-of-the-art results on several public benchmarks of different sizes and domains, while keeping the model efficient.
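The sketch below illustrates, under stated assumptions, the two components the abstract describes: a shallow transformer that fuses per-label signals (text embedding, query centroid, learnable free vector) into a prototype, and a triplet loss whose margin varies per triplet. All names (LabelPrototypeNetwork, dynamic_margin_triplet_loss), the three-token fusion scheme, and the specific margin rule are illustrative assumptions, not the authors' released implementation.

```python
# Illustrative sketch (not the authors' code). A label prototype is produced by a
# shallow transformer attending over three per-label inputs; training uses a
# triplet loss with a hypothetical per-triplet (dynamic) margin.
import torch
import torch.nn as nn
import torch.nn.functional as F


class LabelPrototypeNetwork(nn.Module):
    """Shallow transformer that fuses label-side signals into one prototype."""

    def __init__(self, num_labels: int, dim: int = 768, num_layers: int = 1):
        super().__init__()
        # Learnable free vector per label (one of the three aggregated signals).
        self.free_vectors = nn.Embedding(num_labels, dim)
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=8, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)

    def forward(self, label_ids, label_text_emb, label_centroids):
        # Treat the three signals as a length-3 "token" sequence per label.
        free = self.free_vectors(label_ids)                                   # (B, d)
        tokens = torch.stack([label_text_emb, label_centroids, free], dim=1)  # (B, 3, d)
        fused = self.encoder(tokens)                                          # (B, 3, d)
        prototype = fused.mean(dim=1)                                         # (B, d)
        return F.normalize(prototype, dim=-1)


def dynamic_margin_triplet_loss(query_emb, pos_proto, neg_proto,
                                base_margin: float = 0.3):
    """Triplet loss with a per-triplet margin; the adaptation rule here
    (shrinking the margin for highly similar, ambiguous negatives) is only
    a plausible reading of the paper's dynamic margin, not its exact form."""
    q = F.normalize(query_emb, dim=-1)
    sim_pos = (q * pos_proto).sum(-1)
    sim_neg = (q * neg_proto).sum(-1)
    margin = base_margin * (1.0 - sim_neg.detach().clamp(min=0.0))
    return F.relu(sim_neg - sim_pos + margin).mean()
```

In this reading, the query-side deep encoder and the Label Prototype Network would be optimized jointly against this objective, so that queries move toward their relevant label prototypes while ambiguous negatives are penalized with a softer margin.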
Anthology ID:
2025.naacl-long.537
Volume:
Proceedings of the 2025 Conference of the Nations of the Americas Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers)
Month:
April
Year:
2025
Address:
Albuquerque, New Mexico
Editors:
Luis Chiruzzo, Alan Ritter, Lu Wang
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
10709–10727
URL:
https://preview.aclanthology.org/fix-sig-urls/2025.naacl-long.537/
Cite (ACL):
Kunal Dahiya, Diego Ortego, and David Jimenez-Cabello. 2025. Prototypical Extreme Multi-label Classification with a Dynamic Margin Loss. In Proceedings of the 2025 Conference of the Nations of the Americas Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers), pages 10709–10727, Albuquerque, New Mexico. Association for Computational Linguistics.
Cite (Informal):
Prototypical Extreme Multi-label Classification with a Dynamic Margin Loss (Dahiya et al., NAACL 2025)
PDF:
https://preview.aclanthology.org/fix-sig-urls/2025.naacl-long.537.pdf