A Unified Representation Learning Strategy for Open Relation Extraction with Ranked List Loss

Lou Renze, Zhang Fan, Zhou Xiaowei, Wang Yutong, Wu Minghui, Sun Lin


Abstract
Open Relation Extraction (OpenRE), which aims to extract relational facts from open-domain corpora, is a sub-task of Relation Extraction and a crucial upstream process for many other NLP tasks. However, various previous clustering-based OpenRE strategies either confine themselves to unsupervised paradigms or cannot directly build a unified relational semantic space, hence impacting downstream clustering. In this paper, we propose a novel supervised learning framework named MORE-RLL (Metric learning-based Open Relation Extraction with Ranked List Loss) to construct a semantic metric space by utilizing Ranked List Loss to discover new relational facts. Experiments on real-world datasets show that MORE-RLL achieves excellent performance compared with previous state-of-the-art methods, demonstrating the capability of MORE-RLL in unified semantic representation learning and novel relational fact detection.
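The loss at the core of the framework is Ranked List Loss, a metric-learning objective: for each anchor, positives farther than an inner boundary are pulled in, and negatives inside an outer boundary are pushed out, with harder negatives weighted more heavily. The sketch below is an illustrative NumPy rendition of that general objective under assumed hyperparameter values (`alpha`, `margin`, `temperature` are placeholders), not the paper's exact implementation.

```python
import numpy as np

def ranked_list_loss(embeddings, labels, alpha=1.2, margin=0.4, temperature=10.0):
    """Sketch of a Ranked List Loss-style objective over a batch.

    For each anchor i:
      - positives (same label) pay a hinge penalty when farther than (alpha - margin);
      - negatives (different label) pay a hinge penalty when closer than alpha,
        weighted by exp(temperature * violation) so harder negatives count more.
    """
    n = len(embeddings)
    # Pairwise Euclidean distance matrix (n x n).
    dists = np.linalg.norm(embeddings[:, None, :] - embeddings[None, :, :], axis=-1)
    total = 0.0
    for i in range(n):
        pos = (labels == labels[i]) & (np.arange(n) != i)
        neg = labels != labels[i]
        # Pull positives inside the inner boundary (alpha - margin).
        loss_pos = np.maximum(0.0, dists[i, pos] - (alpha - margin)).sum()
        # Push negatives outside the outer boundary alpha, weighted by violation.
        violation = np.maximum(0.0, alpha - dists[i, neg])
        weights = np.exp(temperature * violation)
        loss_neg = (weights * violation).sum() / max(weights.sum(), 1e-12)
        total += loss_pos + loss_neg
    return total / n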
Anthology ID:
2021.ccl-1.98
Volume:
Proceedings of the 20th Chinese National Conference on Computational Linguistics
Month:
August
Year:
2021
Address:
Huhhot, China
Venue:
CCL
Publisher:
Chinese Information Processing Society of China
Pages:
1096–1108
Language:
English
URL:
https://aclanthology.org/2021.ccl-1.98
Cite (ACL):
Lou Renze, Zhang Fan, Zhou Xiaowei, Wang Yutong, Wu Minghui, and Sun Lin. 2021. A Unified Representation Learning Strategy for Open Relation Extraction with Ranked List Loss. In Proceedings of the 20th Chinese National Conference on Computational Linguistics, pages 1096–1108, Huhhot, China. Chinese Information Processing Society of China.
Cite (Informal):
A Unified Representation Learning Strategy for Open Relation Extraction with Ranked List Loss (Renze et al., CCL 2021)
PDF:
https://preview.aclanthology.org/auto-file-uploads/2021.ccl-1.98.pdf
Data
FewRel