Dangling-Aware Entity Alignment with Mixed High-Order Proximities
Juncheng Liu, Zequn Sun, Bryan Hooi, Yiwei Wang, Dayiheng Liu, Baosong Yang, Xiaokui Xiao, Muhao Chen
Abstract
We study dangling-aware entity alignment in knowledge graphs (KGs), which is an underexplored but important problem. As different KGs are naturally constructed from different sets of entities, a KG commonly contains some dangling entities that cannot find counterparts in other KGs. Therefore, dangling-aware entity alignment is more realistic than conventional entity alignment, where prior studies simply ignore dangling entities. We propose a framework using mixed high-order proximities for dangling-aware entity alignment. Our framework utilizes both the local high-order proximity in a nearest-neighbor subgraph and the global high-order proximity in an embedding space for both dangling detection and entity alignment. Extensive experiments with two evaluation settings show that our method detects dangling entities more precisely and aligns matchable entities better. Further investigations demonstrate that our framework can mitigate the hubness problem in dangling-aware entity alignment.
- Anthology ID: 2022.findings-naacl.88
- Volume: Findings of the Association for Computational Linguistics: NAACL 2022
- Month: July
- Year: 2022
- Address: Seattle, United States
- Venue: Findings
- Publisher: Association for Computational Linguistics
- Pages: 1172–1184
- URL: https://aclanthology.org/2022.findings-naacl.88
- DOI: 10.18653/v1/2022.findings-naacl.88
- Cite (ACL): Juncheng Liu, Zequn Sun, Bryan Hooi, Yiwei Wang, Dayiheng Liu, Baosong Yang, Xiaokui Xiao, and Muhao Chen. 2022. Dangling-Aware Entity Alignment with Mixed High-Order Proximities. In Findings of the Association for Computational Linguistics: NAACL 2022, pages 1172–1184, Seattle, United States. Association for Computational Linguistics.
- Cite (Informal): Dangling-Aware Entity Alignment with Mixed High-Order Proximities (Liu et al., Findings 2022)
- PDF: https://preview.aclanthology.org/ingestion-script-update/2022.findings-naacl.88.pdf
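The abstract does not give implementation details, but two ideas it names, nearest-neighbor proximity in an embedding space and mitigation of the hubness problem, are commonly combined via CSLS (cross-domain similarity local scaling). Below is a minimal NumPy sketch of that combination for dangling detection: a source entity whose best hubness-corrected similarity falls below a threshold is treated as dangling, and is otherwise aligned to its nearest neighbor. The function names, the choice of `k`, and the threshold are illustrative assumptions, not the paper's actual method.

```python
# Hypothetical sketch: nearest-neighbor dangling detection with a CSLS-style
# hubness correction. Names, k, and the threshold are illustrative only.
import numpy as np

def csls_scores(src_emb: np.ndarray, tgt_emb: np.ndarray, k: int = 10) -> np.ndarray:
    """Cross-domain similarity local scaling over cosine similarities.

    src_emb: (n_src, d) source-KG entity embeddings
    tgt_emb: (n_tgt, d) target-KG entity embeddings
    Returns an (n_src, n_tgt) matrix of hubness-corrected similarities.
    """
    src = src_emb / np.linalg.norm(src_emb, axis=1, keepdims=True)
    tgt = tgt_emb / np.linalg.norm(tgt_emb, axis=1, keepdims=True)
    sim = src @ tgt.T  # cosine similarity matrix
    # Mean similarity of each entity to its k nearest cross-KG neighbors.
    r_src = np.mean(np.sort(sim, axis=1)[:, -k:], axis=1)  # (n_src,)
    r_tgt = np.mean(np.sort(sim, axis=0)[-k:, :], axis=0)  # (n_tgt,)
    # CSLS penalizes "hub" targets that are close to everything.
    return 2 * sim - r_src[:, None] - r_tgt[None, :]

def detect_and_align(src_emb, tgt_emb, threshold=0.5, k=10):
    """Mark a source entity as dangling if its best CSLS score is below
    the threshold; otherwise align it to its CSLS nearest neighbor."""
    scores = csls_scores(src_emb, tgt_emb, k)
    best_match = scores.argmax(axis=1)          # candidate counterpart per entity
    is_dangling = scores.max(axis=1) < threshold
    return best_match, is_dangling

# Toy usage with random embeddings.
rng = np.random.default_rng(0)
matches, dangling = detect_and_align(rng.normal(size=(100, 32)),
                                     rng.normal(size=(80, 32)))
```

The CSLS correction subtracts each entity's average similarity to its local neighborhood, so targets that are near everything (hubs) stop dominating nearest-neighbor retrieval; the paper's framework additionally exploits local high-order proximity in a nearest-neighbor subgraph, which this sketch does not attempt to reproduce.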