Evaluating the Effects of Explainable AI Techniques/Explanation Styles for Human Decision Support in the Context of Predictive Process Monitoring
Summary
Predictive Process Monitoring (PPM) leverages machine learning to forecast the future behaviour of ongoing processes and thereby support decision-making across various domains. However, the complexity and opacity of these machine learning models, often termed "black box" models, hinder interpretability and user trust, motivating the development of Explainable AI (XAI) techniques. Despite recent efforts to integrate XAI into the PPM field, it remains unclear how effectively these techniques explain predictions to users and support their decision-making, leaving a gap in our understanding of how explanations affect user behaviour.
This thesis addresses this gap by investigating the impact of explanation styles and perceived AI accuracy on user decision-making within the PPM domain. An empirical user evaluation was conducted to assess the effectiveness of three explanation styles (Feature importance-based, Rule-based, and Counterfactual-based) in influencing task performance, agreement, and decision confidence on decision-making tasks concerning loan application outcomes; the sketch below illustrates the form each style takes.
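To make the three explanation styles concrete, the following minimal sketch shows the kind of output each style produces for a single loan-application prediction. It is an illustration only: all feature names, values, thresholds, and the prediction itself are invented assumptions and do not come from the thesis' experimental setup.

```python
# Illustrative sketch (not from the thesis): example outputs of the three
# explanation styles for a hypothetical loan-application prediction.
# All feature names, values, and thresholds are invented for illustration.

prediction = {"outcome": "rejected", "confidence": 0.82}

# Feature importance-based: each feature's contribution to the prediction,
# with the sign indicating the direction of influence.
feature_importance = {
    "requested_amount": -0.34,        # high amount pushes towards rejection
    "credit_history_length": +0.21,   # longer history pushes towards acceptance
    "number_of_open_loans": -0.12,
}

# Rule-based: an if-then rule describing the decision region the case falls into.
rule = ("IF requested_amount > 25000 AND credit_history_length < 2 years "
        "THEN outcome = rejected")

# Counterfactual-based: the smallest change to the case that would flip the prediction.
counterfactual = {
    "change": "reduce requested_amount from 30000 to 22000",
    "expected_outcome": "accepted",
}

if __name__ == "__main__":
    print("Prediction:", prediction)
    print("Feature importance explanation:", feature_importance)
    print("Rule-based explanation:", rule)
    print("Counterfactual explanation:", counterfactual)
```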
The results demonstrate that perceived AI accuracy significantly influences decision-making, with lower perceived accuracy linked to higher task performance across explanation styles. Counterfactual explanations were particularly effective in enhancing task performance and agreement, whereas Feature importance-based explanations resulted in the lowest agreement levels. Rule-based explanations, in turn, led to higher satisfaction and decision confidence than Feature importance-based explanations. These findings underscore the importance of user evaluations in assessing the effectiveness of XAI explanations. By providing insights into how explanation styles and perceived accuracy shape user trust and engagement, this research contributes to the development of more user-centred and interpretable AI systems.