Abstract
This paper explores how Distantly Supervised Relation Extraction (DS-RE) can benefit from the use of a Universal Graph (UG), the combination of a Knowledge Graph (KG) and a large-scale text collection. A straightforward extension of a current state-of-the-art neural model for DS-RE with a UG may lead to degradation in performance. We first report that this degradation is associated with the difficulty of learning a UG, and then propose two training strategies: (1) Path Type Adaptive Pretraining, which sequentially trains the model on different types of UG paths so as to prevent reliance on a single type of UG path; and (2) a Complexity Ranking Guided Attention mechanism, which restricts the attention span according to the complexity of a UG path so as to force the model to extract features not only from simple UG paths but also from complex ones. Experimental results on both a biomedical dataset and the NYT10 dataset demonstrate the robustness of our methods, and we achieve a new state-of-the-art result on NYT10. The code and datasets used in this paper are available at https://github.com/baodaiqin/UGDSRE.
- Anthology ID:
- 2021.eacl-main.321
- Volume:
- Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume
- Month:
- April
- Year:
- 2021
- Address:
- Online
- Editors:
- Paola Merlo, Jörg Tiedemann, Reut Tsarfaty
- Venue:
- EACL
- Publisher:
- Association for Computational Linguistics
- Pages:
- 3673–3684
- URL:
- https://aclanthology.org/2021.eacl-main.321
- DOI:
- 10.18653/v1/2021.eacl-main.321
- Cite (ACL):
- Qin Dai, Naoya Inoue, Ryo Takahashi, and Kentaro Inui. 2021. Two Training Strategies for Improving Relation Extraction over Universal Graph. In Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume, pages 3673–3684, Online. Association for Computational Linguistics.
- Cite (Informal):
- Two Training Strategies for Improving Relation Extraction over Universal Graph (Dai et al., EACL 2021)
- PDF:
- https://preview.aclanthology.org/nschneid-patch-5/2021.eacl-main.321.pdf
- Code
- baodaiqin/UGDSRE
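The Complexity Ranking Guided Attention described in the abstract might be sketched as follows. This is a minimal illustration only: the complexity scores, the hard top-k masking, and all names and shapes are assumptions for exposition, not the paper's exact formulation (see the linked repository for the authors' implementation).

```python
import numpy as np

def complexity_ranked_attention(path_feats, query, complexities, span):
    """Attend only over the `span` most complex UG paths.

    Restricting the attention span by complexity rank forces the model
    to draw features from complex paths instead of collapsing onto the
    simplest ones (a hedged sketch, not the authors' formulation).

    path_feats:   (n_paths, dim) array of UG-path encodings
    query:        (dim,) relation query vector
    complexities: (n_paths,) complexity score per path (e.g. path length)
    span:         number of paths the attention may cover
    """
    order = np.argsort(complexities)[::-1]   # most complex path first
    allowed = order[:span]                   # restrict the attention span
    scores = path_feats @ query              # raw attention logits
    mask = np.full(len(scores), -np.inf)     # -inf logits are masked out
    mask[allowed] = 0.0
    logits = scores + mask
    weights = np.exp(logits - logits[allowed].max())  # stable softmax
    weights /= weights.sum()
    return weights @ path_feats              # attended path representation
```

Varying `span` during training would let the attention move between bands of simple and complex paths, which is one plausible reading of "restricting the attention span according to the complexity of a UG path".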