Explainable AI for trustworthy intelligent process monitoring: short communication

Bibliographic Details
Main Authors: Johannssen, Arne (Author), Qiu, Peihua (Author), Yeganeh, Ali (Author), Chukhrova, Nataliya (Author)
Format: Article (Journal)
Language: English
Published: November 2025
In: Computers & Industrial Engineering
Year: 2025, Volume: 209, Pages: 1-12
ISSN: 0360-8352
DOI: 10.1016/j.cie.2025.111407
Online Access: Publisher, free of charge, full text: https://doi.org/10.1016/j.cie.2025.111407
Publisher, free of charge, full text: https://www.sciencedirect.com/science/article/pii/S0360835225005534
Description
Summary: Statistical control charts are often based on assumptions that do not hold in complex, high-dimensional, and dynamic environments. To counter these weaknesses, control charts based on artificial intelligence (AI) techniques have emerged as a powerful alternative in recent years. However, their black-box nature limits the transparency, interpretability, and trustworthiness that are essential for realizing Industry 5.0. To address this issue, this Short Communication discusses the necessity of embedding explainable artificial intelligence (XAI) in AI-based control charts. Incorporating XAI enhances the interpretability of AI-based control charts while maintaining their high predictive accuracy. The paper also identifies key challenges in embedding XAI and outlines future research directions for responsible and trustworthy AI-based process monitoring.
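To make the workflow described in the abstract concrete, the following Python sketch is an illustration of the general idea only, not a method taken from the article, which does not prescribe a specific technique. It pairs an AI-based monitoring model (scikit-learn's IsolationForest standing in for an AI-based control chart) with a simple post-hoc attribution that reports which process variables drove an out-of-control signal. The data, the control-limit choice, and the baseline-replacement attribution are all illustrative assumptions.

```python
# Minimal sketch: an AI-based "control chart" (Isolation Forest anomaly
# score) plus a crude post-hoc XAI step. All data and thresholds are
# illustrative assumptions, not the article's method.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Phase I: in-control process data (three quality characteristics).
X_ic = rng.normal(loc=[10.0, 5.0, 2.0], scale=[0.5, 0.2, 0.1], size=(500, 3))

# Fit the AI-based monitoring model on in-control data only.
model = IsolationForest(n_estimators=200, random_state=0).fit(X_ic)

# Control limit: e.g. the 1st percentile of in-control anomaly scores
# (score_samples returns lower values for more abnormal points).
limit = np.percentile(model.score_samples(X_ic), 1)

def explain(x, baseline):
    """Baseline-replacement attribution: how much does resetting each
    feature to its in-control mean raise the anomaly score? Larger
    positive values implicate that feature in the signal."""
    base_score = model.score_samples(x[None, :])[0]
    contrib = np.zeros_like(x)
    for j in range(x.size):
        x_rep = x.copy()
        x_rep[j] = baseline[j]  # reset one feature to its baseline
        contrib[j] = model.score_samples(x_rep[None, :])[0] - base_score
    return contrib

# Phase II: a new observation with a shift in the second characteristic.
x_new = np.array([10.1, 6.5, 2.05])
score = model.score_samples(x_new[None, :])[0]
if score < limit:  # out-of-control signal
    contrib = explain(x_new, X_ic.mean(axis=0))
    print(f"Signal (score {score:.3f} < limit {limit:.3f})")
    for j, c in enumerate(contrib):
        print(f"  feature {j}: attribution {c:+.3f}")
```

In practice an established XAI technique such as SHAP would typically replace this ad-hoc attribution, but the sketch conveys the two-stage workflow the Short Communication advocates: first signal, then explain which variables are responsible.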
Item Description: Available online: 29 July 2025, article version: 11 August 2025
Viewed on 17 November 2025
Physical Description: Online Resource