Abstract
Open-world Relation Extraction (OpenRE) has recently garnered significant attention. However, existing approaches tend to oversimplify the problem by assuming that all instances of unlabeled data belong to novel classes, thereby limiting the practicality of these methods. We argue that the OpenRE setting should be more aligned with the characteristics of real-world data. Specifically, we propose two key improvements: (a) unlabeled data should encompass known and novel classes, including negative instances; and (b) the set of novel classes should represent long-tail relation types. Furthermore, we observe that popular relations can often be implicitly inferred through specific patterns, while long-tail relations tend to be explicitly expressed. Motivated by these insights, we present a method called KNoRD (Known and Novel Relation Discovery), which effectively classifies explicitly and implicitly expressed relations from known and novel classes within unlabeled data. Experimental evaluations on several Open-world RE benchmarks demonstrate that KNoRD consistently outperforms other existing methods, achieving significant performance gains.
- Anthology ID: 2023.emnlp-main.880
- Volume: Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
- Month: December
- Year: 2023
- Address: Singapore
- Editors: Houda Bouamor, Juan Pino, Kalika Bali
- Venue: EMNLP
- Publisher: Association for Computational Linguistics
- Pages: 14227–14242
- URL: https://aclanthology.org/2023.emnlp-main.880
- DOI: 10.18653/v1/2023.emnlp-main.880
- Cite (ACL): William Hogan, Jiacheng Li, and Jingbo Shang. 2023. Open-world Semi-supervised Generalized Relation Discovery Aligned in a Real-world Setting. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 14227–14242, Singapore. Association for Computational Linguistics.
- Cite (Informal): Open-world Semi-supervised Generalized Relation Discovery Aligned in a Real-world Setting (Hogan et al., EMNLP 2023)
- PDF: https://preview.aclanthology.org/landing_page/2023.emnlp-main.880.pdf