Zhitao He
2022
CogKGE: A Knowledge Graph Embedding Toolkit and Benchmark for Representing Multi-source and Heterogeneous Knowledge
Zhuoran Jin | Tianyi Men | Hongbang Yuan | Zhitao He | Dianbo Sui | Chenhao Wang | Zhipeng Xue | Yubo Chen | Jun Zhao
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics: System Demonstrations
In this paper, we propose CogKGE, a knowledge graph embedding (KGE) toolkit, which aims to represent multi-source and heterogeneous knowledge. For multi-source knowledge, unlike existing methods that mainly focus on entity-centric knowledge, CogKGE also supports the representation of event-centric, commonsense and linguistic knowledge. For heterogeneous knowledge, besides structured triple facts, CogKGE leverages additional unstructured information, such as text descriptions, node types and temporal information, to enrich the meaning of embeddings. CogKGE is designed to provide a unified programming framework for KGE tasks and a series of knowledge representations for downstream tasks. As a research framework, CogKGE consists of five parts: the core, data, model, knowledge and adapter modules. As a knowledge discovery toolkit, CogKGE provides pre-trained embedders to discover new facts, cluster entities and check facts. Furthermore, we construct two benchmark datasets for further research on multi-source heterogeneous KGE tasks: EventKG240K and CogNet360K. We also release an online system to discover knowledge visually. Source code, datasets and pre-trained embeddings are publicly available at GitHub, with a short instruction video.
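To illustrate the kind of triple scoring that underlies tasks such as discovering new facts and checking facts, the sketch below implements a minimal TransE-style scorer in PyTorch. This is a generic example, not CogKGE's actual API: the class name `TransEScorer`, its constructor arguments, and the usage shown are all hypothetical, chosen only to demonstrate the technique.

```python
# Minimal TransE-style scoring sketch (illustrative only; not CogKGE's API).
# TransE embeds a fact (h, r, t) so that h + r is close to t;
# a smaller distance means the triple is more plausible.
import torch
import torch.nn as nn


class TransEScorer(nn.Module):  # hypothetical name, for illustration
    def __init__(self, num_entities: int, num_relations: int, dim: int = 128):
        super().__init__()
        self.ent = nn.Embedding(num_entities, dim)
        self.rel = nn.Embedding(num_relations, dim)
        nn.init.xavier_uniform_(self.ent.weight)
        nn.init.xavier_uniform_(self.rel.weight)

    def forward(self, heads, relations, tails):
        # L1 distance ||h + r - t||; lower score = more plausible triple.
        h = self.ent(heads)
        r = self.rel(relations)
        t = self.ent(tails)
        return torch.norm(h + r - t, p=1, dim=-1)


# Usage sketch: score one candidate fact, as in link prediction / fact checking.
scorer = TransEScorer(num_entities=1000, num_relations=50)
score = scorer(torch.tensor([3]), torch.tensor([7]), torch.tensor([42]))
print(score.item())
```

In a toolkit like the one described, such a scorer would be trained on the structured triples and then queried over candidate facts; the multi-source and heterogeneous extensions add further inputs (text descriptions, node types, temporal information) on top of this basic scoring idea.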