How Explainable AI Methods Support Data-Driven Decision-Making

Dominik Stoffels*, Susanne Grabl, Thomas Fischer, Marina Fiedler

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceedings › Conference contribution › peer-review

Abstract

Explainable AI (XAI) holds great potential to reveal the patterns in black-box AI models and to support data-driven decision-making. We apply four post-hoc explanation methods to the illustrative example of unwanted job turnover, demonstrating how they can support data-driven decision-making in human resource management (HRM). We show that XAI can be a useful aid in data-driven decision-making, but also highlight potential drawbacks and limitations of which users in research and practice should be aware.

Original language: English
Title of host publication: Conceptualizing Digital Responsibility for the Information Age - Proceedings of the 18th International Conference on Wirtschaftsinformatik, 2023, Vol. 1
Editors: Daniel Beverungen, Matthias Trier, Christiane Lehrer
Publisher: Springer
Pages: 325-340
Number of pages: 16
ISBN (Print): 9783031801181
Publication status: Published - 2025
Event: 18th International Conference on Wirtschaftsinformatik, WI 2023 - Paderborn, Germany
Duration: 18 Sept 2023 - 21 Sept 2023

Publication series

Name: Lecture Notes in Information Systems and Organisation
Volume: 74
ISSN (Print): 2195-4968
ISSN (Electronic): 2195-4976

Conference

Conference: 18th International Conference on Wirtschaftsinformatik, WI 2023
Country/Territory: Germany
City: Paderborn
Period: 18.09.2023 - 21.09.2023

Keywords

  • Data-driven decision-making
  • Explainable AI
  • Machine learning
