GenerativeRE: Incorporating a Novel Copy Mechanism and Pretrained Model for Joint Entity and Relation Extraction

Jiarun Cao, Sophia Ananiadou


Abstract
Previous neural Seq2Seq models have shown their effectiveness in jointly extracting relation triplets. However, most of these models suffer from incomplete and out-of-order extraction when dealing with multi-token entities in input sentences. To tackle these problems, we propose a generative, multi-task learning framework named GenerativeRE. We first propose a special entity labelling method applied to both input and output sequences. During the training stage, GenerativeRE fine-tunes the pre-trained generative model and learns the special entity labels simultaneously. During the inference stage, we propose a novel copy mechanism equipped with three mask strategies, which generates the most probable tokens by narrowing the scope of the model decoder. Experimental results show that our model achieves 4.6% and 0.9% F1-score improvements over the current state-of-the-art methods on the NYT24 and NYT29 benchmark datasets, respectively.
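For readers unfamiliar with copy-style decoding, the sketch below illustrates the general idea of masking the decoder vocabulary so that only tokens from the input sentence (plus special labels) can be generated at a given step. This is a minimal, hypothetical illustration of vocabulary masking, not the paper's implementation; all names and the choice of allowed tokens are assumptions.

```python
# Hypothetical sketch of vocabulary masking during copy-style decoding.
# NOT the authors' code; names (source_token_ids, special_label_ids) are illustrative.
# Idea: at each decoding step, add -inf to logits of disallowed tokens, so the decoder
# can only "copy" tokens from the input sentence or emit special labels.

import torch

def masked_copy_logits(logits, allowed_token_ids):
    """Keep logits only for allowed tokens; everything else receives -inf.

    logits:            (vocab_size,) raw decoder scores for one step
    allowed_token_ids: ids permitted by the current mask strategy, e.g. tokens
                       occurring in the source sentence plus special entity labels
    """
    mask = torch.full_like(logits, float("-inf"))
    mask[allowed_token_ids] = 0.0
    return logits + mask

# Toy usage with a 10-token vocabulary
logits = torch.randn(10)
source_token_ids = torch.tensor([2, 5, 7])   # tokens present in the input sentence
special_label_ids = torch.tensor([8])        # e.g. an entity-boundary label
allowed = torch.cat([source_token_ids, special_label_ids])

next_token = torch.argmax(masked_copy_logits(logits, allowed))
print(int(next_token))  # guaranteed to be one of the allowed ids
```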
Anthology ID:
2021.findings-emnlp.182
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2021
Month:
November
Year:
2021
Address:
Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
Findings
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
2119–2126
URL:
https://aclanthology.org/2021.findings-emnlp.182
DOI:
10.18653/v1/2021.findings-emnlp.182
Cite (ACL):
Jiarun Cao and Sophia Ananiadou. 2021. GenerativeRE: Incorporating a Novel Copy Mechanism and Pretrained Model for Joint Entity and Relation Extraction. In Findings of the Association for Computational Linguistics: EMNLP 2021, pages 2119–2126, Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
GenerativeRE: Incorporating a Novel Copy Mechanism and Pretrained Model for Joint Entity and Relation Extraction (Cao & Ananiadou, Findings 2021)
PDF:
https://preview.aclanthology.org/naacl-24-ws-corrections/2021.findings-emnlp.182.pdf
Video:
https://preview.aclanthology.org/naacl-24-ws-corrections/2021.findings-emnlp.182.mp4