A Multi-Gate Encoder for Joint Entity and Relation Extraction

Xiong Xiong, Liu Yunfei, Liu Anqi, Gong Shuai, Li Shengyang


Abstract
Named entity recognition and relation extraction are core sub-tasks of relational triple extraction. Recent studies have used parameter sharing or joint decoding to create interaction between the two tasks. However, it remains difficult to preserve the specificity of task-specific features while allowing the two tasks to interact effectively. In this paper, we propose a multi-gate encoder that models bidirectional task interaction while keeping sufficient feature specificity through a gating mechanism. Specifically, we design two types of independent gates: task gates, which generate task-specific features, and interaction gates, which generate instructive features to guide the other task. Experiments show that our method raises the state-of-the-art (SOTA) relation F1 scores on the ACE04, ACE05 and SciERC datasets to 63.8% (+1.3%), 68.2% (+1.4%) and 39.4% (+1.0%), respectively, with faster inference than the previous SOTA model.
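
The abstract describes two kinds of gates: task gates that filter a shared encoding into task-specific features, and interaction gates that pass guidance from one task to the other. The following is a minimal, hypothetical PyTorch sketch of such a gating scheme; the module names, dimensions, and fusion choices are assumptions made for illustration and do not reproduce the paper's actual architecture.

import torch
import torch.nn as nn


class Gate(nn.Module):
    """Elementwise sigmoid gate over the concatenation of a feature and its context."""

    def __init__(self, hidden_size: int):
        super().__init__()
        self.proj = nn.Linear(2 * hidden_size, hidden_size)

    def forward(self, x: torch.Tensor, ctx: torch.Tensor) -> torch.Tensor:
        # g in (0, 1) controls how much of x is kept.
        g = torch.sigmoid(self.proj(torch.cat([x, ctx], dim=-1)))
        return g * x


class MultiGateEncoderSketch(nn.Module):
    """Hypothetical sketch: task gates for NER/RE plus bidirectional interaction gates."""

    def __init__(self, hidden_size: int = 768):
        super().__init__()
        # Task gates: derive task-specific NER / RE features from the shared encoding.
        self.ner_task_gate = Gate(hidden_size)
        self.re_task_gate = Gate(hidden_size)
        # Interaction gates: gated guidance flowing from one task to the other.
        self.ner_to_re_gate = Gate(hidden_size)
        self.re_to_ner_gate = Gate(hidden_size)

    def forward(self, shared: torch.Tensor):
        # shared: (batch, seq_len, hidden_size), e.g. contextual embeddings from a PLM.
        ner_feat = self.ner_task_gate(shared, shared)
        re_feat = self.re_task_gate(shared, shared)
        # Bidirectional interaction: each task receives gated features from the other.
        ner_out = ner_feat + self.re_to_ner_gate(re_feat, ner_feat)
        re_out = re_feat + self.ner_to_re_gate(ner_feat, re_feat)
        return ner_out, re_out


if __name__ == "__main__":
    enc = MultiGateEncoderSketch(hidden_size=16)
    ner_out, re_out = enc(torch.randn(2, 5, 16))
    print(ner_out.shape, re_out.shape)  # torch.Size([2, 5, 16]) twice

Keeping the four gates as independent modules mirrors the abstract's point that task-specific and interaction features are generated separately; how they are combined downstream (here, a simple residual sum) is purely an assumption of this sketch.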
Anthology ID:
2022.ccl-1.75
Volume:
Proceedings of the 21st Chinese National Conference on Computational Linguistics
Month:
October
Year:
2022
Address:
Nanchang, China
Editors:
Maosong Sun (孙茂松), Yang Liu (刘洋), Wanxiang Che (车万翔), Yang Feng (冯洋), Xipeng Qiu (邱锡鹏), Gaoqi Rao (饶高琦), Yubo Chen (陈玉博)
Venue:
CCL
Publisher:
Chinese Information Processing Society of China
Pages:
848–860
Language:
English
URL:
https://preview.aclanthology.org/build-pipeline-with-new-library/2022.ccl-1.75/
Cite (ACL):
Xiong Xiong, Liu Yunfei, Liu Anqi, Gong Shuai, and Li Shengyang. 2022. A Multi-Gate Encoder for Joint Entity and Relation Extraction. In Proceedings of the 21st Chinese National Conference on Computational Linguistics, pages 848–860, Nanchang, China. Chinese Information Processing Society of China.
Cite (Informal):
A Multi-Gate Encoder for Joint Entity and Relation Extraction (Xiong et al., CCL 2022)
PDF:
https://preview.aclanthology.org/build-pipeline-with-new-library/2022.ccl-1.75.pdf
Data
ACE 2005, SciERC