Vol 6, No 2 (2021)

Explainable AI (XAI) for Design Decisions

Abstract

Artificial Intelligence (AI) has become an important support tool in modern design processes, influencing decisions related to form generation, material selection, usability evaluation, and performance optimization. However, many AI-driven design systems operate as black boxes, offering limited insight into how and why specific decisions are produced. This lack of transparency reduces designer trust, makes validation difficult, and raises ethical and accountability concerns. Explainable Artificial Intelligence (XAI) aims to address these challenges by providing interpretable and understandable explanations for AI outputs. This paper presents a comprehensive review of Explainable AI in the context of design decision-making. It discusses the fundamentals of XAI, its relevance to design disciplines, commonly used explanation techniques, and application scenarios across product design, engineering design, and user experience design. The paper also highlights benefits, limitations, and future research directions of XAI-enabled design systems. The review shows that integrating XAI into design workflows can improve trust, collaboration, and informed decision-making, while supporting responsible and human-centered use of AI technologies.
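To make the idea of an interpretable explanation concrete, the sketch below shows a minimal additive feature-attribution explanation, the principle behind techniques such as SHAP, applied to a hypothetical linear "design score" model. All feature names, weights, and data here are illustrative assumptions, not from the reviewed paper; for a linear model the attribution w_f * (x_f - E[x_f]) is exact and the contributions sum to the difference between the candidate's prediction and the average prediction.

```python
# Minimal sketch of an additive feature-attribution explanation
# (the idea behind methods such as SHAP) for a linear design-score
# model. Feature names, weights, and data are hypothetical.

from statistics import mean

# Hypothetical linear model: score = bias + sum(w_f * x_f)
WEIGHTS = {"material_cost": -0.8, "durability": 1.2, "usability": 0.9}
BIAS = 2.0

# Hypothetical background set of past design candidates.
BACKGROUND = [
    {"material_cost": 3.0, "durability": 4.0, "usability": 3.5},
    {"material_cost": 5.0, "durability": 2.0, "usability": 4.0},
    {"material_cost": 4.0, "durability": 3.0, "usability": 3.0},
]

def predict(x):
    return BIAS + sum(WEIGHTS[f] * x[f] for f in WEIGHTS)

def explain(x):
    """For a linear model, the exact attribution of feature f is
    w_f * (x_f - E[x_f]); the attributions sum to predict(x) minus
    the mean prediction over the background data."""
    baseline = {f: mean(row[f] for row in BACKGROUND) for f in WEIGHTS}
    return {f: WEIGHTS[f] * (x[f] - baseline[f]) for f in WEIGHTS}

candidate = {"material_cost": 2.5, "durability": 4.5, "usability": 4.0}
contrib = explain(candidate)
for f, c in sorted(contrib.items(), key=lambda kv: -abs(kv[1])):
    print(f"{f:14s} {c:+.2f}")
```

Ranking the attributions by magnitude, as in the final loop, is one simple way a design-support tool could tell a designer which factors most pushed a candidate's score above or below the norm.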

Keywords: Explainable AI, design decisions, human-centered design, interpretable models, decision support systems

