Yujia Gu
2023
ECOLA: Enhancing Temporal Knowledge Embeddings with Contextualized Language Representations
Zhen Han | Ruotong Liao | Jindong Gu | Yao Zhang | Zifeng Ding | Yujia Gu | Heinz Koeppl | Hinrich Schütze | Volker Tresp
Findings of the Association for Computational Linguistics: ACL 2023
Since conventional knowledge embedding models cannot take full advantage of the abundant textual information, there have been extensive research efforts in enhancing knowledge embedding using texts. However, existing enhancement approaches do not apply to temporal knowledge graphs (tKGs), which contain time-dependent event knowledge with complex temporal dynamics. Specifically, existing enhancement approaches often assume that knowledge embeddings are time-independent. In contrast, entity embeddings in tKG models usually evolve over time, which poses the challenge of aligning temporally relevant texts with entities. To this end, we study enhancing temporal knowledge embeddings with textual data in this paper. As an approach to this task, we propose Enhanced Temporal Knowledge Embeddings with Contextualized Language Representations (ECOLA), which takes the temporal aspect into account and injects textual information into temporal knowledge embeddings. We further introduce three new datasets for training and evaluating ECOLA. Extensive experiments show that ECOLA significantly enhances temporal KG embedding models, with up to 287% relative improvement in Hits@1 on the link prediction task. The code and models are publicly available at https://github.com/mayhugotong/ECOLA.
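The abstract's core idea, aligning temporally relevant texts with entity embeddings that evolve over time, suggests a simple joint objective. Below is a minimal, hypothetical PyTorch sketch: the class ToyTemporalKGE, the additive time encoding, the DistMult-style scorer, the function joint_loss, and the weight lam are all illustrative assumptions, and text_vec stands in for a contextualized sentence encoding (e.g., from a frozen BERT); this is not ECOLA's actual architecture or loss.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyTemporalKGE(nn.Module):
    """Toy time-aware KG embedding (illustrative, not ECOLA's model)."""
    def __init__(self, n_ent, n_rel, n_time, dim=64):
        super().__init__()
        self.ent = nn.Embedding(n_ent, dim)
        self.rel = nn.Embedding(n_rel, dim)
        self.tim = nn.Embedding(n_time, dim)

    def entity_at(self, e, t):
        # Entity embedding evolves over time (simple additive time encoding).
        return self.ent(e) + self.tim(t)

    def object_scores(self, s, r, t):
        # DistMult-style score of every entity as the object of (s, r, ?, t).
        q = self.entity_at(s, t) * self.rel(r)                          # (B, d)
        cand = self.ent.weight.unsqueeze(0) + self.tim(t).unsqueeze(1)  # (B, n_ent, d)
        return (q.unsqueeze(1) * cand).sum(-1)                          # (B, n_ent)

def joint_loss(model, s, r, o, t, text_vec, lam=0.5):
    # Link prediction term: rank the true object among all entities.
    link = F.cross_entropy(model.object_scores(s, r, t), o)
    # Alignment term: the subject's embedding at time t should be close to
    # the contextualized representation of the temporally relevant text.
    align = 1 - F.cosine_similarity(model.entity_at(s, t), text_vec, dim=-1).mean()
    return link + lam * align

# Usage on random toy data; text_vec stands in for a frozen LM encoding.
model = ToyTemporalKGE(n_ent=100, n_rel=20, n_time=50)
s, r = torch.randint(0, 100, (8,)), torch.randint(0, 20, (8,))
o, t = torch.randint(0, 100, (8,)), torch.randint(0, 50, (8,))
text_vec = torch.randn(8, 64)
joint_loss(model, s, r, o, t, text_vec).backward()
```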
2021
Learning Neural Ordinary Equations for Forecasting Future Links on Temporal Knowledge Graphs
Zhen Han | Zifeng Ding | Yunpu Ma | Yujia Gu | Volker Tresp
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
There has been an increasing interest in inferring future links on temporal knowledge graphs (KGs). While links on temporal KGs vary continuously over time, existing approaches model temporal KGs in discrete state spaces. To this end, we propose a novel continuum model by extending the idea of neural ordinary differential equations (ODEs) to multi-relational graph convolutional networks. The proposed model preserves the continuous nature of dynamic multi-relational graph data and encodes both temporal and structural information into continuous-time dynamic embeddings. In addition, a novel graph transition layer is applied to capture the transitions on the dynamic graph, i.e., edge formation and dissolution. We perform extensive experiments on five benchmark datasets for temporal KG reasoning, showing our model’s superior performance on the future link forecasting task.
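To make the continuum idea concrete, here is a minimal sketch of a graph neural ODE using the torchdiffeq library's odeint solver. The single-relation dynamics function GraphODEFunc and the crude adjacency normalization are illustrative assumptions, not the paper's model, which extends this to multi-relational graph convolutions with a graph transition layer.

```python
import torch
import torch.nn as nn
from torchdiffeq import odeint  # pip install torchdiffeq

class GraphODEFunc(nn.Module):
    """dH/dt = tanh(A_hat @ H @ W): a single-relation stand-in for the
    multi-relational graph convolutional dynamics described in the paper."""
    def __init__(self, a_hat, dim):
        super().__init__()
        self.a_hat = a_hat                 # normalized adjacency, (n, n)
        self.w = nn.Linear(dim, dim, bias=False)

    def forward(self, t, h):
        # The solver calls this with the current time t and state h.
        return torch.tanh(self.a_hat @ self.w(h))

n, dim = 5, 16
adj = torch.rand(n, n)
a_hat = adj / adj.sum(-1, keepdim=True)    # crude row normalization
h0 = torch.randn(n, dim)                   # entity embeddings at t = 0
times = torch.tensor([0.0, 0.5, 1.0])      # arbitrary continuous query times
h_t = odeint(GraphODEFunc(a_hat, dim), h0, times)  # (len(times), n, dim)
```

Because the solver integrates the dynamics rather than stepping through fixed snapshots, embeddings can be queried at any continuous time point, which is the property the discrete-state-space baselines lack.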