DegreEmbed: incorporating entity embedding into logic rule learning for knowledge graph reasoning

Tracking #: 3161-4375

Haotian Li
Hongri Liu
Yao Wang
Guodong Xin
Yuliang Wei

Responsible editor: 
Pascal Hitzler

Submission type: 
Full Paper
Abstract:
Knowledge graphs (KGs), as structured representations of real-world facts, are intelligent databases incorporating human knowledge that can help machines imitate human problem solving. However, KGs are usually huge, and facts are inevitably missing from them, which undermines applications such as question answering and recommender systems that rely on knowledge graph reasoning. Link prediction for knowledge graphs is the task of completing missing facts by reasoning over existing knowledge. Two main streams of research are widely studied: one learns low-dimensional embeddings for entities and relations that can capture latent patterns, and the other gains good interpretability by mining logical rules. Unfortunately, previous studies rarely pay attention to heterogeneous KGs. In this paper, we propose DegreEmbed, a model that combines embedding-based learning and logic rule mining for inference on KGs. Specifically, we study the problem of predicting missing links in heterogeneous KGs, which involve entities and relations of various types, from the perspective of node degree. Experimentally, we demonstrate that our DegreEmbed model outperforms state-of-the-art methods on real-world datasets, and the rules mined by our model are of high quality and interpretability.
Decision/Status:
Minor Revision

Solicited Reviews:
Review #1
Anonymous submitted on 17/Aug/2022
Minor Revision
Review Comment:

I would like to thank the authors for their diligence in addressing most of our comments. The quality of the paper has improved significantly, and the contribution is more convincing.
However, a few concerns have not yet been addressed. For instance, the authors focus on presenting only the results where their approach outperforms the compared methods. Table 7 shows that TuckER outperforms DegreEmbed on FB15K-237. It would be useful to present insights on why this is the case, particularly in this KG.

One claim that the authors repeat multiple times in the paper is that an advantage of their approach is its applicability to heterogeneous KGs: "... which differs from our goal to learn in heterogeneous graphs"; "Unfortunately, previous studies rarely pay attention to heterogeneous KGs". Aren't all KGs heterogeneous? It would be beneficial to explain what exactly this claim means and why other approaches are not suitable for heterogeneous KGs.

Review #2
By Aaron Eberhart submitted on 09/Feb/2023
Minor Revision
Review Comment:

This paper proposes a new embedding system for neuro-symbolic reasoning that aims to support heterogeneous datasets.

I think this paper addresses an interesting topic, is well written after the revision, and reports moderately good results. Although they are not spectacular, I think the paper should be accepted in its current form with one minor addition (see below).

Concluding Remark:
I think it would be helpful to add a short explanation or definition of the term 'heterogeneous'. The example is sufficient, I think, and any reader would probably be familiar with the notion, so a formal definition may be excessive. However, the current explanation of 'entities and relations of different types mixing up' feels slightly inadequate, since one could interpret 'types' in various ways. Since the notion plays a key role in the experiments, clarifying it briefly would improve the paper.