Seeing eye to eye: trustworthy embodiment for task-based conversational agents

Cited by: 1

Authors
Robb, David A. [1 ]
Lopes, Jose [1 ,2 ]
Ahmad, Muneeb I. [3 ]
Mckenna, Peter E. [4 ]
Liu, Xingkun [1 ]
Lohan, Katrin [5 ]
Hastie, Helen [1 ,6 ]
Affiliations
[1] Heriot Watt Univ, Dept Comp Sci, Edinburgh, Scotland
[2] Semasio, Porto, Portugal
[3] Swansea Univ, Dept Comp Sci, Swansea, Wales
[4] Heriot Watt Univ, Dept Psychol, Edinburgh, Scotland
[5] Eastern Switzerland Univ Appl Sci, St Gallen, Switzerland
[6] Univ Edinburgh, Sch Informat, Edinburgh, Scotland
Funding
UK Engineering and Physical Sciences Research Council (EPSRC);
Keywords
conversational agent; remote robots; autonomous systems; human-robot teaming; social robotics; user engagement; cognitive load; TRUST; ROBOT; ENGAGEMENT; GAZE; LOAD;
DOI
10.3389/frobt.2023.1234767
CLC classification
TP24 [Robotics];
Subject classification codes
080202; 1405;
Abstract
Smart speakers and conversational agents have been accepted into our homes for a number of tasks such as playing music, interfacing with the internet of things, and more recently, general chit-chat. However, they have been less readily accepted in our workplaces. This may be due to the data privacy and security concerns that exist with commercially available smart speakers, but another reason may be that a smart speaker is simply too abstract and does not portray the social cues associated with a trustworthy work colleague. Here, we present an in-depth mixed-method study in which we investigate this question of embodiment in a serious task-based work scenario involving a first-responder team. We explore the concepts of trust, engagement, cognitive load, and human performance using a humanoid head-style robot, a commercially available smart speaker, and a specially developed dialogue manager. Studying the effect of embodiment on trust, a highly subjective and multi-faceted phenomenon, is clearly challenging, and our results indicate that the robot, with its anthropomorphic facial features, expressions, and eye gaze, was potentially trusted more than the smart speaker. In addition, we found that embodying a conversational agent helped increase task engagement and performance compared to the smart speaker. This study indicates that embodiment could be useful for transitioning conversational agents into the workplace, and further in situ, "in the wild" experiments with domain workers could be conducted to confirm this.
Pages: 18