Distantly Supervised Relation Extraction with Sentence Reconstruction and Knowledge Base Priors

Fenia Christopoulou, Makoto Miwa, Sophia Ananiadou


Abstract
We propose a multi-task, probabilistic approach to facilitate distantly supervised relation extraction by bringing closer the representations of sentences that contain the same Knowledge Base pairs. To achieve this, we bias the latent space of sentences via a Variational Autoencoder (VAE) that is trained jointly with a relation classifier. The latent code guides the pair representations and influences sentence reconstruction. Experimental results on two datasets created via distant supervision indicate that multi-task learning results in performance benefits. Additional exploration of employing Knowledge Base priors into the VAE reveals that the sentence space can be shifted towards that of the Knowledge Base, offering interpretability and further improving results.
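To make the joint objective concrete, below is a minimal sketch of the kind of setup the abstract describes: a sentence VAE trained together with a relation classifier, with the KL term measured against a prior that could be derived from the Knowledge Base. This is not the authors' implementation; it assumes a PyTorch-style model, and all module names, dimensions, and the loss weighting are illustrative.

# Hypothetical sketch: joint VAE + relation classifier with an optional
# KB-derived latent prior. Not the paper's code; names and sizes are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class JointVAERelationModel(nn.Module):
    def __init__(self, vocab_size, hidden=128, latent=64, n_relations=10):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        self.encoder = nn.GRU(hidden, hidden, batch_first=True)
        self.to_mu = nn.Linear(hidden, latent)
        self.to_logvar = nn.Linear(hidden, latent)
        self.latent_to_hidden = nn.Linear(latent, hidden)
        self.decoder = nn.GRU(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, vocab_size)
        self.classifier = nn.Linear(latent, n_relations)

    def forward(self, tokens):
        x = self.embed(tokens)                      # (B, T, H)
        _, h = self.encoder(x)                      # final state (1, B, H)
        mu, logvar = self.to_mu(h[-1]), self.to_logvar(h[-1])
        # Reparameterization trick: sample a latent code per sentence.
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
        # The latent code conditions reconstruction, as in the abstract.
        h0 = self.latent_to_hidden(z).unsqueeze(0)
        dec, _ = self.decoder(x, h0)
        return self.out(dec), self.classifier(z), mu, logvar

def loss_fn(logits, rel_logits, tokens, rel_labels, mu, logvar,
            prior_mu=None, beta=1.0):
    # Reconstruction + relation classification + KL to a (possibly KB-derived) prior.
    rec = F.cross_entropy(logits.transpose(1, 2), tokens)
    cls = F.cross_entropy(rel_logits, rel_labels)
    if prior_mu is None:
        prior_mu = torch.zeros_like(mu)             # standard N(0, I) prior
    kl = 0.5 * torch.sum(torch.exp(logvar) + (mu - prior_mu) ** 2
                         - 1 - logvar, dim=-1)
    return rec + cls + beta * kl.mean()

# Toy usage with random data, to show the multi-task update.
model = JointVAERelationModel(vocab_size=5000)
tokens = torch.randint(0, 5000, (4, 20))
labels = torch.randint(0, 10, (4,))
logits, rel_logits, mu, logvar = model(tokens)
loss_fn(logits, rel_logits, tokens, labels, mu, logvar).backward()

Replacing prior_mu with an embedding of the sentence's KB pair is one way to shift the sentence latent space toward that of the Knowledge Base, in the spirit of the priors the abstract mentions.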
Anthology ID:
2021.naacl-main.2
Volume:
Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Month:
June
Year:
2021
Address:
Online
Editors:
Kristina Toutanova, Anna Rumshisky, Luke Zettlemoyer, Dilek Hakkani-Tur, Iz Beltagy, Steven Bethard, Ryan Cotterell, Tanmoy Chakraborty, Yichao Zhou
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
11–26
URL:
https://aclanthology.org/2021.naacl-main.2
DOI:
10.18653/v1/2021.naacl-main.2
Cite (ACL):
Fenia Christopoulou, Makoto Miwa, and Sophia Ananiadou. 2021. Distantly Supervised Relation Extraction with Sentence Reconstruction and Knowledge Base Priors. In Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 11–26, Online. Association for Computational Linguistics.
Cite (Informal):
Distantly Supervised Relation Extraction with Sentence Reconstruction and Knowledge Base Priors (Christopoulou et al., NAACL 2021)
PDF:
https://preview.aclanthology.org/improve-issue-templates/2021.naacl-main.2.pdf
Video:
https://preview.aclanthology.org/improve-issue-templates/2021.naacl-main.2.mp4