Abstract
Knowledge graph completion (KGC), the task of predicting missing information based on the existing relational data inside a knowledge graph (KG), has drawn significant attention in recent years. However, the predictive power of KGC methods is often limited by the incompleteness of existing knowledge graphs, which come from different sources and languages. In both monolingual and multilingual settings, KGs are potentially complementary to each other. In this paper, we study the problem of multi-KG completion, where we focus on maximizing the collective knowledge from different KGs to alleviate the incompleteness of individual KGs. Specifically, we propose a novel method called CKGC-CKD that uses relation-aware graph convolutional network encoder models on both individual KGs and a large fused KG in which seed alignments between KGs are regarded as edges for message propagation. An additional mutual knowledge distillation mechanism is also employed to maximize the knowledge transfer between the model of the "global" fused KG and the models of the "local" individual KGs. Experimental results on multilingual datasets show that our method outperforms all state-of-the-art models in the KGC task.
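The mutual knowledge distillation mechanism mentioned above can be illustrated with a minimal sketch. The code below is not the authors' implementation; it only shows the generic idea of symmetric distillation, where the "global" model (on the fused KG) and a "local" model (on one individual KG) each act as teacher and student for the other by matching their prediction distributions over shared candidate entities via KL divergence. All names, shapes, and the temperature value are illustrative assumptions; the actual CKGC-CKD encoders are relation-aware GCNs.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over candidate-entity scores."""
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q), averaged over the batch of queries."""
    return float(np.mean(np.sum(p * (np.log(p + eps) - np.log(q + eps)), axis=-1)))

def mutual_distillation_loss(global_logits, local_logits, temperature=2.0):
    """Symmetric KD loss: each model is also a student of the other.

    Each row of the logits holds one model's scores over candidate tail
    entities for a (head, relation, ?) query that both models can answer.
    In a real training loop, gradients from each KL term would update
    only the "student" side of that term.
    """
    p_global = softmax(global_logits, temperature)
    p_local = softmax(local_logits, temperature)
    return kl_divergence(p_global, p_local) + kl_divergence(p_local, p_global)

# Toy example (hypothetical scores): 2 queries, 4 candidate entities each.
g = np.array([[2.0, 0.5, 0.1, -1.0], [0.0, 1.5, 0.3, 0.2]])
l = np.array([[1.8, 0.7, 0.0, -0.8], [0.1, 1.2, 0.5, 0.1]])
loss = mutual_distillation_loss(g, l)
```

A symmetric loss like this lets knowledge flow in both directions, so the fused-KG model can absorb signals unique to an individual KG while each individual-KG model benefits from cross-KG evidence it cannot see locally.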
URL
https://arxiv.org/abs/2305.15895