The Role of Ontologies and Knowledge in Explainable AI

Tracking #: 3529-4743

Authors: 
Roberto Confalonieri
Oliver Kutz
Diego Calvanese
Jose Maria Alonso-Moral
Shang-Ming Zhou

Responsible editor: 
Cogan Shimizu

Submission type: 
Editorial

Abstract: 
This is the editorial for the special issue "The Role of Ontologies and Knowledge in Explainable AI". The accepted papers leveraged ontologies, knowledge graphs, and knowledge representation and reasoning in eXplainable AI (XAI), and can be classified into two distinct groups. One group of papers focused on proposing ontology specifications and extensions to enhance the conceptualization of user-centered explainable systems across various application domains, including chemistry, cyberbullying, finance, and data science. These papers introduced domain-specific ontologies that provide a structured framework for understanding and explaining systems within each domain. The other group took a more foundational approach, presenting logic-based methodologies that foster the development of explainable-by-design systems. These papers emphasized logical reasoning techniques for achieving explainability and offered frameworks for constructing systems that inherently prioritize interpretability. In summary, the accepted papers demonstrated how ontologies, knowledge graphs, and knowledge representation and reasoning can advance the field of XAI.

Tags: 
Reviewed

Decision/Status: 
Accept