
Accepted for/Published in: JMIR Human Factors

Date Submitted: Mar 6, 2023
Date Accepted: Nov 20, 2023

The final, peer-reviewed published version of this preprint can be found here:


Shevtsova D, Ahmed A, Boot IW, Sanges C, Hudecek M, Jacobs JJ, Hort S, Vrijhoef HJ

Trust in and Acceptance of Artificial Intelligence Applications in Medicine: Mixed Methods Study

JMIR Hum Factors 2024;11:e47031

DOI: 10.2196/47031

PMID: 38231544

PMCID: 10831593

Trust in and acceptance of Artificial Intelligence applications in medicine: a mixed methods study

  • Daria Shevtsova; 
  • Anam Ahmed; 
  • Iris WA Boot; 
  • Carmen Sanges; 
  • Michael Hudecek; 
  • John JL Jacobs; 
  • Simon Hort; 
  • Hubertus JM Vrijhoef

ABSTRACT

Background:

Artificial intelligence (AI)-powered technologies are increasingly used in almost all fields, including medicine. Ensuring trust in and acceptance of such technologies is crucial for the successful implementation, spread, and timely adoption of medical AI applications worldwide. While AI applications in medicine provide advantages to the current health care system, they also come with various challenges regarding, for instance, data privacy, accountability, and equity and fairness, which could hinder the implementation of medical AI.

Objective:

The aim of this study was to identify factors related to trust in and acceptance of novel AI-powered medical technologies and to assess the relevance of those factors among key stakeholders.

Methods:

This study used a mixed methods design comprising a rapid review of the existing literature, followed by a stakeholder survey. The rapid review aimed to identify factors related to trust in or acceptance of AI applications in medicine. Next, an electronic survey containing the factors derived from the rapid review was disseminated among key stakeholder groups. Participants (N=22) were asked to rate, on a 5-point Likert scale (1=irrelevant, 5=relevant), the extent to which they considered each of the 19 factors relevant to trust in and acceptance of novel AI applications in medicine.

Results:

The rapid review (N=32 studies) yielded 110 factors related to trust in and 77 factors related to acceptance of AI technologies in medicine. Closely related factors were assigned to 1 of 19 overarching ‘umbrella’ factors, which were further grouped into 4 categories (human-related, technology-related, ethical and legal, and additional factors). The 19 categorized ‘umbrella’ factors were presented as survey statements and evaluated by relevant stakeholders. Survey participants (N=22) represented researchers (18/22, 82%), technology providers (5/22, 23%), hospital staff (3/22, 14%), and policy makers (3/22, 14%). Of the 19 factors, 16 (84%) were considered highly relevant to trust in and acceptance of novel AI applications in medicine. Highly relevant factors included human-related factors (e.g., the type of institution AI professionals originate from), technology-related factors (e.g., the explainability and transparency of the AI application’s processes and outcomes), ethical and legal factors (e.g., data-use transparency), and additional factors (e.g., AI applications being environmentally friendly). The patient’s gender, age, and education level were found to be of low relevance (3/19, 16%).

Conclusions:

The results of this study may help implementers of medical AI applications understand what drives trust in and acceptance of AI-powered technologies among key stakeholders in medicine. This, in turn, would allow implementers to identify strategies that foster trust in and acceptance of medical AI among key stakeholders and potential users.






© The authors. All rights reserved. This is a privileged document currently under peer review/community review (or an accepted/rejected manuscript). The authors have provided JMIR Publications with an exclusive license to publish this preprint on its website for review and ahead-of-print citation purposes only. While the final peer-reviewed paper may be licensed under a CC BY license on publication, at this stage the authors and publisher expressly prohibit redistribution of this draft paper other than for review purposes.