Dingyao Yu


2023

Improving Knowledge Graph Completion with Generative Hard Negative Mining
Zile Qiao | Wei Ye | Dingyao Yu | Tong Mo | Weiping Li | Shikun Zhang
Findings of the Association for Computational Linguistics: ACL 2023

Contrastive learning has recently shown great potential for improving text-based knowledge graph completion (KGC). In this paper, we propose to learn a more semantically structured entity representation space for text-based KGC via hard negative mining. Specifically, we leverage a sequence-to-sequence architecture to generate high-quality hard negatives. These negatives are sampled from the same decoding distribution as the anchor (the correct entity), so they are inherently semantically close to the anchor and therefore suitably hard. A self-information-enhanced contrasting strategy is further incorporated into the Seq2Seq generator to systematically diversify the produced negatives. Extensive experiments on three KGC benchmarks demonstrate the hardness and diversity of our generated negatives and the resulting performance gains on KGC.
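
The sketch below illustrates the core idea described in the abstract (not the authors' released code): sample hard negative entities from the same decoding distribution that scores the correct entity, then contrast the gold entity against those negatives with an InfoNCE-style loss. All names here (ToyEntityDecoder, entity_table, the dimensions, and the sampling setup) are hypothetical placeholders for illustration.

```python
# Minimal sketch, assuming a toy decoder stands in for the Seq2Seq generator.
import torch
import torch.nn.functional as F

torch.manual_seed(0)
embed_dim, num_entities, num_negatives = 64, 1000, 8

class ToyEntityDecoder(torch.nn.Module):
    """Stand-in for the Seq2Seq decoder: maps a query to a distribution over entities."""
    def __init__(self):
        super().__init__()
        self.proj = torch.nn.Linear(embed_dim, num_entities)

    def forward(self, query_repr):
        return F.log_softmax(self.proj(query_repr), dim=-1)

decoder = ToyEntityDecoder()
entity_table = torch.nn.Embedding(num_entities, embed_dim)

query = torch.randn(1, embed_dim)      # encoded (head, relation) query
gold_entity = torch.tensor([42])       # index of the correct tail entity

# Hard negatives: sample from the same decoding distribution as the anchor,
# excluding the gold entity itself.
probs = decoder(query).exp().squeeze(0).clone()
probs[gold_entity] = 0.0
neg_ids = torch.multinomial(probs, num_negatives)

# InfoNCE-style contrastive loss: gold entity (index 0) vs. sampled hard negatives.
anchor = entity_table(gold_entity)                   # (1, d)
negatives = entity_table(neg_ids)                    # (k, d)
candidates = torch.cat([anchor, negatives], dim=0)   # (k+1, d)
logits = query @ candidates.T                        # similarity scores
loss = F.cross_entropy(logits, torch.zeros(1, dtype=torch.long))
print(loss.item())
```

Because the negatives are drawn from the model's own decoding distribution, they tend to be semantically close to the correct entity, which is what makes them "hard"; the paper's self-information-enhanced strategy for diversifying these samples is not reproduced in this sketch.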