Accepted for/Published in: JMIR Human Factors
Date Submitted: Aug 29, 2024
Date Accepted: Jan 5, 2025
Development of AF’fective: an Explainable AI Application to Support Remote Monitoring of Atrial Fibrillation Patients After Catheter Ablation
ABSTRACT
Background:
The opaque nature of AI algorithms has led to distrust in medical contexts, particularly in the treatment and monitoring of atrial fibrillation (AF). While previous studies in explainable AI show potential to address this distrust, they often focus solely on ECG factors and lack real-life field insights.
Objective:
We address this gap by integrating additional empirically validated risk factors into the system and by involving cardiologists in co-designing and evaluating the approach with real-life patient cases and data.
Methods:
We conducted a three-stage iterative design process with 23 cardiologists to co-design, evaluate, and pilot an explainable AI application. The first stage identified four doctor personas and seven explainability strategies, which were reviewed in the second stage; four strategies were selected as highly effective and feasible for pilot deployment. A Progressive Web Application (PWA) was then developed around these four strategies and tested by cardiologists in the third stage.
Results:
The final PWA prototype received above-average user experience (UX) evaluations and excelled at motivating doctors' intention to use it, owing to its ease of use, reliable information, and explainable functionality. We also discuss in-depth field insights from cardiologists who used the system in clinical contexts.
Conclusions:
Our study identified effective explainability strategies, emphasized the importance of curating actionable factors and setting correct expectations, and suggested that many of these insights are applicable to other disease care contexts, paving the way for future real-life clinical evaluations.