On The Role of Knowledge Graphs in Explainable AI

Tracking #: 2198-3411

This paper is currently under review
Freddy Lecue

Responsible editor: Guest Editor 10-years SWJ

Submission type: 

Abstract: 
The current hype around Artificial Intelligence (AI) mostly refers to the success of machine learning and its sub-domain of deep learning. However, AI also encompasses other areas, such as knowledge representation and reasoning, and distributed AI, i.e., areas that need to be combined to reach the level of intelligence initially envisioned in the 1950s. Explainable AI (XAI) is now regarded as the core enabler for industry to apply AI in products at scale, particularly for industries operating critical systems. This paper reviews XAI not only from a Machine Learning perspective, but also from other AI research areas such as AI Planning or Constraint Satisfaction and Search. We expose the XAI challenges of these AI fields, their existing approaches and limitations, and the opportunities for knowledge graphs and their underlying technologies.