Contrastive Learning-Enhanced Nearest Neighbor Mechanism for Multi-Label Text Classification

Xi’ao Su, Ran Wang, Xinyu Dai


Abstract
Multi-Label Text Classification (MLTC) is a fundamental and challenging task in natural language processing. Previous studies mainly focus on learning text representations and modeling label correlation, but neglect the rich knowledge available from existing similar instances when predicting the labels of a specific text. To make up for this oversight, we propose a k nearest neighbor (kNN) mechanism which retrieves several neighbor instances and interpolates the model output with their labels. Moreover, we design a multi-label contrastive learning objective that makes the model aware of the kNN classification process and improves the quality of the retrieved neighbors during inference. Extensive experiments show that our method brings consistent and significant performance improvements to multiple MLTC models, including state-of-the-art pretrained and non-pretrained ones.
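The sketch below illustrates the kind of kNN label interpolation the abstract describes, not the authors' exact formulation: model probabilities are blended with a distance-weighted average of the retrieved neighbors' multi-hot label vectors. Parameter names such as `lambda_knn` and `temperature` are illustrative assumptions.

```python
import numpy as np

def knn_interpolate(query_emb, model_probs, store_embs, store_labels,
                    k=8, temperature=1.0, lambda_knn=0.5):
    """Blend classifier output with labels of the k nearest training instances.

    query_emb:    (d,)   embedding of the input text
    model_probs:  (L,)   per-label probabilities from the base classifier
    store_embs:   (N, d) embeddings of training instances (the datastore)
    store_labels: (N, L) multi-hot label vectors of those instances
    """
    # Euclidean distance from the query to every stored instance.
    dists = np.linalg.norm(store_embs - query_emb, axis=1)
    nn_idx = np.argsort(dists)[:k]

    # Turn negative distances into neighbor weights with a softmax.
    weights = np.exp(-dists[nn_idx] / temperature)
    weights /= weights.sum()

    # Distance-weighted average of the neighbors' label vectors
    # yields a kNN-based probability for each label.
    knn_probs = weights @ store_labels[nn_idx]

    # Interpolate the kNN distribution with the model's own prediction.
    return lambda_knn * knn_probs + (1.0 - lambda_knn) * model_probs
```

A contrastive objective over the text embeddings (pulling instances with overlapping label sets closer) is what the paper uses to make the datastore neighbors more reliable; the sketch above only covers the inference-time interpolation step.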
Anthology ID:
2022.acl-short.75
Volume:
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
672–679
URL:
https://aclanthology.org/2022.acl-short.75
DOI:
10.18653/v1/2022.acl-short.75
Cite (ACL):
Xi’ao Su, Ran Wang, and Xinyu Dai. 2022. Contrastive Learning-Enhanced Nearest Neighbor Mechanism for Multi-Label Text Classification. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 672–679, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Contrastive Learning-Enhanced Nearest Neighbor Mechanism for Multi-Label Text Classification (Su et al., ACL 2022)
PDF:
https://preview.aclanthology.org/improve-issue-templates/2022.acl-short.75.pdf
Data
RCV1