SENT: Sentence-level Distant Relation Extraction via Negative Training

Ruotian Ma, Tao Gui, Linyang Li, Qi Zhang, Xuanjing Huang, Yaqian Zhou


Abstract
Distant supervision for relation extraction provides uniform bag labels for each sentence inside the bag, while accurate sentence labels are important for downstream applications that need the exact relation type. Directly using bag labels for sentence-level training introduces much noise, thus severely degrading performance. In this work, we propose the use of negative training (NT), in which a model is trained using complementary labels, i.e., with the supervision that “the instance does not belong to these complementary labels”. Since the probability of selecting the true label as a complementary label is low, NT provides less noisy information. Furthermore, a model trained with NT is able to separate the noisy data from the training data. Based on NT, we propose a sentence-level framework, SENT, for distant relation extraction. SENT not only filters the noisy data to construct a cleaner dataset, but also performs a re-labeling process to transform the noisy data into useful training data, thus further benefiting the model’s performance. Experimental results show that the proposed method significantly improves over previous methods in both sentence-level evaluation and de-noising effect.
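The core idea of negative training can be sketched as follows. Instead of maximizing the probability of the (possibly wrong) distantly supervised label, a complementary label is sampled from the other classes and its probability is pushed down via a loss of the form -log(1 - p_c). This is an illustrative sketch under stated assumptions (NumPy, uniform complementary-label sampling), not the authors' exact implementation:

```python
import numpy as np

def negative_training_loss(logits, noisy_labels, rng):
    """Sketch of a negative-training (NT) loss.

    For each instance, sample a complementary label c that differs
    from the (possibly noisy) distant label, and minimize
    -log(1 - p(c)): "the instance does NOT belong to class c".
    With many classes, c rarely coincides with the true label,
    so this supervision is mostly correct even under label noise.
    """
    n, k = logits.shape
    # Sample complementary labels uniformly from the other k-1 classes.
    offsets = rng.integers(1, k, size=n)
    comp = (noisy_labels + offsets) % k
    # Numerically stable softmax over the class dimension.
    z = logits - logits.max(axis=1, keepdims=True)
    probs = np.exp(z) / np.exp(z).sum(axis=1, keepdims=True)
    p_comp = probs[np.arange(n), comp]
    # NT objective: push DOWN the complementary-label probability.
    return -np.log(1.0 - p_comp + 1e-12).mean()
```

A low loss means the model assigns little probability mass to the sampled complementary labels; instances on which the positive-label probability stays low after NT can then be flagged as noisy, as in the filtering step the abstract describes.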
Anthology ID:
2021.acl-long.484
Volume:
Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)
Month:
August
Year:
2021
Address:
Online
Editors:
Chengqing Zong, Fei Xia, Wenjie Li, Roberto Navigli
Venues:
ACL | IJCNLP
Publisher:
Association for Computational Linguistics
Pages:
6201–6213
URL:
https://aclanthology.org/2021.acl-long.484
DOI:
10.18653/v1/2021.acl-long.484
Cite (ACL):
Ruotian Ma, Tao Gui, Linyang Li, Qi Zhang, Xuanjing Huang, and Yaqian Zhou. 2021. SENT: Sentence-level Distant Relation Extraction via Negative Training. In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), pages 6201–6213, Online. Association for Computational Linguistics.
Cite (Informal):
SENT: Sentence-level Distant Relation Extraction via Negative Training (Ma et al., ACL-IJCNLP 2021)
PDF:
https://preview.aclanthology.org/nschneid-patch-1/2021.acl-long.484.pdf
Video:
https://preview.aclanthology.org/nschneid-patch-1/2021.acl-long.484.mp4
Code:
rtmaww/SENT