Your robot therapist is not your therapist: understanding the role of AI-powered mental health chatbots

Cited by: 7
Authors: Khawaja, Zoha [1]; Belisle-Pipon, Jean-Christophe [1]
Affiliations:
[1] Simon Fraser Univ, Fac Hlth Sci, Burnaby, BC, Canada
Keywords: artificial intelligence; chatbot; mental health services; therapeutic misconception; AI ethics; THERAPEUTIC MISCONCEPTION; ETHICS; ISSUES
DOI: 10.3389/fdgth.2023.1278186
Chinese Library Classification: R19 [Health Organization and Services (Health Services Management)]
Abstract
Artificial intelligence (AI)-powered chatbots have the potential to substantially increase access to affordable and effective mental health services by supplementing the work of clinicians. Their 24/7 availability and accessibility through a mobile phone allow individuals to obtain help whenever and wherever needed, overcoming financial and logistical barriers. Although psychological AI chatbots can significantly improve the delivery of mental health care services, they do not come without ethical and technical challenges. Major concerns include providing inadequate or harmful support, exploiting vulnerable populations, and potentially producing discriminatory advice due to algorithmic bias. Moreover, users do not always fully understand the nature of the relationship they have with these chatbots. There can be significant misunderstandings about the exact purpose of a chatbot, particularly regarding care expectations, its ability to adapt to the particularities of users, and its responsiveness to users' needs and the resources or treatments it can offer. Hence, it is imperative that users are aware of the limited therapeutic relationship they can enjoy when interacting with mental health chatbots. Ignorance or misunderstanding of such limitations, or of the role of psychological AI chatbots, may lead to a therapeutic misconception (TM), in which the user underestimates the restrictions of such technologies and overestimates their ability to provide actual therapeutic support and guidance. TM raises major ethical concerns that can worsen one's mental health, contributing to the global mental health crisis. This paper explores the various ways in which TM can occur, particularly through inaccurate marketing of these chatbots, the formation of a digital therapeutic alliance with them, harmful advice resulting from bias in their design and algorithms, and the chatbots' inability to foster autonomy in patients.
Pages: 13