Trust in and Acceptance of Artificial Intelligence Applications in Medicine: Mixed Methods Study

Cited by: 0
Authors
Shevtsova, Daria [1,2]
Ahmed, Anam [1]
Boot, Iris W. A. [1]
Sanges, Carmen [3]
Hudecek, Michael [3]
Jacobs, John J. L. [4]
Hort, Simon [5]
Vrijhoef, Hubertus J. M. [1]
Affiliations
[1] Panaxea Bv, Pettelaarpk 84, NL-5216 PP Den Bosch, Netherlands
[2] Vrije Univ Amsterdam, Amsterdam, Netherlands
[3] Univ Klinikum Wurzburg, Wurzburg, Germany
[4] Ortec Bv, Zoetermeer, Netherlands
[5] Fraunhofer Inst Prod Technol, Aachen, Germany
Source
JMIR HUMAN FACTORS | 2024, Vol. 11
Funding
European Union Horizon 2020
Keywords
trust; acceptance; artificial intelligence; medicine; mixed methods; rapid review; survey;
DOI
10.2196/47031
Chinese Library Classification
R19 [Health care organization and services (health services administration)]
Abstract
Background: Artificial intelligence (AI)-powered technologies are increasingly used in almost all fields, including medicine. However, ensuring trust in and acceptance of such technologies is crucial for the successful implementation, spread, and timely adoption of medical AI applications worldwide. Although AI applications in medicine offer advantages to the current health care system, they also pose challenges regarding, for instance, data privacy, accountability, and equity and fairness, which could hinder their implementation.
Objective: The aim of this study was to identify factors related to trust in and acceptance of novel AI-powered medical technologies and to assess the relevance of those factors among relevant stakeholders.
Methods: This study used a mixed methods design. First, a rapid review of the existing literature was conducted to identify factors related to trust in and acceptance of novel AI applications in medicine. Next, an electronic survey including the rapid review-derived factors was disseminated among key stakeholder groups. Participants (N=22) were asked to rate on a 5-point Likert scale (1=irrelevant to 5=relevant) the extent to which they considered each of the 19 factors relevant to trust in and acceptance of novel AI applications in medicine.
Results: The rapid review (N=32 papers) yielded 110 factors related to trust and 77 factors related to acceptance of AI technology in medicine. Closely related factors were assigned to 1 of 19 overarching umbrella factors, which were further grouped into 4 categories: human-related (eg, the type of institution AI professionals originate from), technology-related (eg, the explainability and transparency of AI application processes and outcomes), ethical and legal (eg, data use transparency), and additional factors (eg, AI applications being environment friendly). The 19 umbrella factors were presented as survey statements, which were evaluated by relevant stakeholders. Survey participants (N=22) represented researchers (n=18, 82%), technology providers (n=5, 23%), hospital staff (n=3, 14%), and policy makers (n=3, 14%). Of the 19 factors, 16 (84%), spanning the human-related, technology-related, ethical and legal, and additional categories, were considered highly relevant to trust in and acceptance of novel AI applications in medicine. The patient's gender, age, and education level were found to be of low relevance (3/19, 16%).
Conclusions: The results of this study could help implementers of medical AI applications understand what drives trust in and acceptance of AI-powered technologies among key stakeholders in medicine. Consequently, implementers could identify strategies that facilitate trust in and acceptance of medical AI applications among key stakeholders and potential users.
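To make the survey step concrete, the minimal sketch below (not taken from the paper) shows one way 5-point Likert relevance ratings could be aggregated per umbrella factor and split into high versus low relevance. The factor names, the ratings, and the 3.5 cutoff are illustrative assumptions, not values reported by the study, which based its high/low relevance grouping on the stakeholders' evaluations of the 19 survey statements.

```python
# Illustrative sketch: aggregate hypothetical Likert ratings (1=irrelevant ... 5=relevant)
# for a few umbrella factors and label each as high or low relevance.
from statistics import mean

# Hypothetical ratings from a handful of stakeholders; the real survey had N=22 participants
# and 19 umbrella factors.
ratings = {
    "explainability_and_transparency": [5, 4, 5, 4, 5],
    "data_use_transparency": [4, 5, 4, 4, 5],
    "patient_gender_age_education": [2, 3, 2, 1, 3],
}

HIGH_RELEVANCE_CUTOFF = 3.5  # assumed threshold, not from the study

for factor, scores in ratings.items():
    avg = mean(scores)
    label = "high relevance" if avg >= HIGH_RELEVANCE_CUTOFF else "low relevance"
    print(f"{factor}: mean={avg:.2f} -> {label}")
```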
Pages: 17