Text Classification with Negative Supervision

Sora Ohashi, Junya Takayama, Tomoyuki Kajiwara, Chenhui Chu, Yuki Arase


Abstract
Advanced pre-trained models for text representation have achieved state-of-the-art performance on various text classification tasks. However, the discrepancy between the semantic similarity of texts and labelling standards affects classifiers, i.e., it leads to lower performance in cases where classifiers should assign different labels to semantically similar texts. To address this problem, we propose a simple multitask learning model that uses negative supervision. Specifically, our model encourages texts with different labels to have distinct representations. Comprehensive experiments show that our model outperforms the state-of-the-art pre-trained model on single- and multi-label classification, on both sentence and document classification, and on classification in three different languages.
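The page does not reproduce the paper's implementation, so the following PyTorch sketch only illustrates the general idea stated in the abstract, not the authors' exact method: a classifier trained with the usual cross-entropy loss plus an auxiliary loss that penalizes similarity between in-batch representations of texts carrying different labels. The cosine-similarity formulation, the in-batch pairing scheme, and names such as neg_weight are assumptions made for this sketch.

import torch.nn as nn
import torch.nn.functional as F

class NegativeSupervisionClassifier(nn.Module):
    """Classifier trained with a main cross-entropy loss plus an
    auxiliary "negative supervision" loss that pushes apart the
    representations of in-batch texts bearing different labels."""

    def __init__(self, encoder, hidden_size, num_labels, neg_weight=1.0):
        super().__init__()
        # `encoder` is assumed to map a batch of inputs to a
        # (batch, hidden_size) tensor, e.g. a pre-trained text encoder.
        self.encoder = encoder
        self.classifier = nn.Linear(hidden_size, num_labels)
        self.neg_weight = neg_weight  # auxiliary-loss weight (hypothetical name)

    def forward(self, inputs, labels):
        h = self.encoder(inputs)              # (batch, hidden_size)
        logits = self.classifier(h)
        ce_loss = F.cross_entropy(logits, labels)

        # Negative supervision: for every in-batch pair with *different*
        # labels, penalize positive cosine similarity between their vectors.
        h_norm = F.normalize(h, dim=-1)
        sim = h_norm @ h_norm.t()             # (batch, batch) cosine similarities
        diff = (labels.unsqueeze(0) != labels.unsqueeze(1)).float()
        neg_loss = (sim * diff).clamp(min=0).sum() / diff.sum().clamp(min=1.0)

        return ce_loss + self.neg_weight * neg_loss, logits

# Hypothetical usage, given some `encoder`, `batch_inputs`, `batch_labels`:
#   model = NegativeSupervisionClassifier(encoder, hidden_size=768, num_labels=5)
#   loss, logits = model(batch_inputs, batch_labels)

Treating the similarity penalty as a second task alongside classification is one natural reading of "multitask learning with negative supervision"; the paper itself should be consulted for the actual loss and pair-sampling strategy.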
Anthology ID:
2020.acl-main.33
Volume:
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2020
Address:
Online
Editors:
Dan Jurafsky, Joyce Chai, Natalie Schluter, Joel Tetreault
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
351–357
URL:
https://aclanthology.org/2020.acl-main.33
DOI:
10.18653/v1/2020.acl-main.33
Cite (ACL):
Sora Ohashi, Junya Takayama, Tomoyuki Kajiwara, Chenhui Chu, and Yuki Arase. 2020. Text Classification with Negative Supervision. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 351–357, Online. Association for Computational Linguistics.
Cite (Informal):
Text Classification with Negative Supervision (Ohashi et al., ACL 2020)
PDF:
https://preview.aclanthology.org/emnlp-22-attachments/2020.acl-main.33.pdf
Video:
http://slideslive.com/38929209