2023
HAHE: Hierarchical Attention for Hyper-Relational Knowledge Graphs in Global and Local Level
Haoran Luo | Haihong E | Yuhao Yang | Yikai Guo | Mingzhi Sun | Tianyu Yao | Zichen Tang | Kaiyang Wan | Meina Song | Wei Lin
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Link prediction on hyper-relational knowledge graphs (HKGs) is a worthwhile endeavor. An HKG consists of hyper-relational facts (H-Facts), each composed of a main triple and several auxiliary attribute-value qualifiers, which can represent comprehensive factual information effectively. The internal structure of an HKG can be represented globally as a hypergraph and locally as semantic sequences. However, existing research seldom models the graphical and sequential structure of HKGs simultaneously, limiting HKG representation. To overcome this limitation, we propose a novel Hierarchical Attention model for HKG Embedding (HAHE), including global-level and local-level attention. The global-level attention models the graphical structure of an HKG using hypergraph dual-attention layers, while the local-level attention learns the sequential structure inside H-Facts via heterogeneous self-attention layers. Experimental results indicate that HAHE achieves state-of-the-art performance on link prediction over standard HKG datasets. In addition, HAHE addresses the issue of HKG multi-position prediction for the first time, increasing the applicability of the HKG link prediction task. Our code is publicly available.
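To make the local-level modeling concrete, the following is a minimal, hypothetical sketch (not the authors' released code) of how an H-Fact, i.e. a main triple (h, r, t) plus attribute-value qualifier pairs, can be linearized into a token sequence and encoded with self-attention. All names and sizes (HFactEncoder, EMB_DIM, the vocabulary counts) are illustrative assumptions; HAHE's heterogeneous self-attention additionally distinguishes token roles (head, relation, tail, attribute, value), which the plain Transformer layer used here does not.

```python
# Illustrative sketch only: linearize one hyper-relational fact and
# encode it with standard self-attention. Names and sizes are assumed.
import torch
import torch.nn as nn

EMB_DIM = 64
NUM_ENTITIES = 1000    # hypothetical entity vocabulary size
NUM_RELATIONS = 100    # hypothetical relation/attribute vocabulary size

class HFactEncoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.ent_emb = nn.Embedding(NUM_ENTITIES, EMB_DIM)
        self.rel_emb = nn.Embedding(NUM_RELATIONS, EMB_DIM)
        # Plain Transformer self-attention over the linearized fact;
        # HAHE's heterogeneous variant would also condition on token roles.
        layer = nn.TransformerEncoderLayer(
            d_model=EMB_DIM, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)

    def forward(self, h, r, t, quals):
        # Linearize: [head, relation, tail, attr_1, val_1, attr_2, val_2, ...]
        tokens = [self.ent_emb(h), self.rel_emb(r), self.ent_emb(t)]
        for a, v in quals:
            tokens += [self.rel_emb(a), self.ent_emb(v)]
        seq = torch.stack(tokens, dim=0).unsqueeze(0)  # (1, seq_len, EMB_DIM)
        return self.encoder(seq)  # contextualized token embeddings

# Usage: encode one H-Fact with two attribute-value qualifiers.
enc = HFactEncoder()
ids = lambda n: torch.tensor(n)
out = enc(ids(0), ids(1), ids(2), [(ids(3), ids(4)), (ids(5), ids(6))])
print(out.shape)  # torch.Size([1, 7, 64]): 3 triple tokens + 2x2 qualifiers
```

The contextualized token embeddings from such a local encoder could then feed a global, hypergraph-level attention over facts sharing entities, which is the hierarchy the abstract describes.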