Explainable AI for trustworthy intelligent process monitoring: short communication

Bibliographic Details
Main Authors: Johannssen, Arne (author), Qiu, Peihua (author), Yeganeh, Ali (author), Chukhrova, Nataliya (author)
Document Type: Article (Journal)
Language: English
Published: November 2025
In: Computers & Industrial Engineering
Year: 2025, Volume: 209, Pages: 1-12
ISSN: 0360-8352
DOI: 10.1016/j.cie.2025.111407
Online Access: Publisher, free of charge, full text: https://doi.org/10.1016/j.cie.2025.111407
Publisher, free of charge, full text: https://www.sciencedirect.com/science/article/pii/S0360835225005534
Statement of Responsibility: Arne Johannssen, Peihua Qiu, Ali Yeganeh, Nataliya Chukhrova
Description
Summary: Statistical control charts are often based on assumptions that do not hold in complex, high-dimensional, and dynamic environments. To counter these weaknesses, control charts based on artificial intelligence (AI) techniques have emerged as a powerful alternative in recent years. However, their black-box nature limits the transparency, interpretability, and trustworthiness that are essential to realizing Industry 5.0. To address this issue, this Short Communication discusses the necessity of embedding explainable artificial intelligence (XAI) in AI-based control charts. Incorporating XAI enhances the interpretability of AI-based control charts while maintaining their high predictive accuracy. The paper also identifies key challenges in embedding XAI and outlines future research directions for responsible and trustworthy AI-based process monitoring.
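To illustrate the approach described in the summary, here is a minimal sketch in Python (not taken from the article; the Isolation Forest model, the mean-imputation attribution, and all names and thresholds are illustrative assumptions). It pairs an AI-based control chart with a simple explanation layer that indicates which variables drive an out-of-control signal:

    import numpy as np
    from sklearn.ensemble import IsolationForest

    rng = np.random.default_rng(0)

    # Phase I: in-control reference data (five quality characteristics).
    X_ref = rng.normal(0.0, 1.0, size=(500, 5))

    # AI-based control chart: an Isolation Forest trained on in-control data.
    model = IsolationForest(n_estimators=200, random_state=0).fit(X_ref)

    # Control limit: the 1st percentile of in-control anomaly scores
    # (score_samples returns higher values for more "normal" points).
    limit = np.percentile(model.score_samples(X_ref), 1)

    def explain(x, model, X_ref):
        # Simple explanation layer: for each feature, replace its value
        # with the in-control mean and measure how much the score recovers.
        base = model.score_samples(x.reshape(1, -1))[0]
        contrib = np.empty(x.size)
        for j in range(x.size):
            x_imputed = x.copy()
            x_imputed[j] = X_ref[:, j].mean()
            contrib[j] = model.score_samples(x_imputed.reshape(1, -1))[0] - base
        return contrib  # larger value -> feature j drives the alarm more

    # Phase II: monitor a new observation with a shift in feature 2.
    x_new = rng.normal(0.0, 1.0, size=5)
    x_new[2] += 4.0

    score = model.score_samples(x_new.reshape(1, -1))[0]
    if score < limit:
        print(f"Out of control: score {score:.3f} < limit {limit:.3f}")
        for j, c in enumerate(explain(x_new, model, X_ref)):
            print(f"  feature {j}: contribution {c:+.3f}")

The mean-imputation attribution used here is a deliberately simple stand-in for the XAI layer; in practice, established XAI methods such as SHAP or LIME would typically fill this role.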
Description: Available online: 29 July 2025, article version: 11 August 2025
Accessed on 17 November 2025
Description: Online resource