Impact of Explainable Artificial Intelligence (XAI) on Data-Driven Decision-Making (D3M)

Authors

  • Zeba, M. Afshar Alam, Harleen Kaur, Ihtiram Raza Khan, Bhavya Alankar

Abstract

Data-driven decision-making plays an important role in research by producing explained and augmented data results. To manage these complex dynamics, explainability and monitoring of results are crucial for cognitive understanding. Decoding applications that supply opaque results within the black-box landscape is therefore necessary for extensive data-centric decision-making.

To validate data predictions generated by software-mining or machine-learning agents, their interpretations must be analyzed systematically. Agents that lack interpretive capabilities cannot explain their predicted outcome scenarios. Consequently, this paper focuses on interpreting data with Explainable Artificial Intelligence (XAI) to support high-stakes, data-driven decision-making and enable policy scenarios. We examine how XAI can support decision-making systems that oversee high-stakes risk data. Converting observational data into knowledge therefore requires cognitive support for both XAI systems and decision-makers.

Published

2024-04-03

How to Cite

Zeba, M. Afshar Alam, Harleen Kaur, Ihtiram Raza Khan, Bhavya Alankar. (2024). Impact of Explainable Artificial Intelligence (XAI) on Data-Driven Decision-Making (D3M). SJIS-P, 36(1), 58–62. Retrieved from http://sjis.scandinavian-iris.org/index.php/sjis/article/view/788

Issue

36(1)

Section

Articles