Care Robot Transparency Isn't Enough for Trust
Cited by: 0
Authors:
Poulsen, Adam [1]
Burmeister, Oliver K. [1]
Tien, David [1]
Affiliations:
[1] Charles Sturt Univ, Sch Comp & Math, Bathurst, NSW, Australia
Keywords:
machine transparency; healthcare robotics; robot ethics; human-robot interaction
DOI: none available
Chinese Library Classification: TM [Electrical Technology]; TN [Electronic Technology, Communication Technology]
Discipline Codes: 0808; 0809
Abstract:
A recent study featuring a new kind of care robot indicated that participants expect a robot's ethical decision-making to be transparent if they are to develop trust, even though the same 'inspection of thoughts' is not expected of a human carer. At first glance, this might suggest that robot transparency mechanisms are required for users to develop trust in robot-made ethical decisions. However, the participants desired transparency only when they did not know the specifics of a human-robot social interaction. Humans trust others without observing their thoughts, which implies that trustworthiness is determined by other means. The study reported here suggests that this means is social interaction and observation, signifying that trust is a social construct, and that these 'social determinants of trust' are the truly transparent elements. This socially determined behaviour draws on notions of virtue ethics: if a caregiver (nurse or robot) consistently provides good, ethical care, then patients can trust that caregiver to continue doing so. The same social determinants may apply to care robots, and thus it ought to be possible to trust them without the ability to see their thoughts. This study suggests why transparency mechanisms may not be effective in developing trust in a care robot's ethical decision-making, and that roboticists instead need to build sociable elements into care robots to help patients develop trust in that decision-making.
Pages: 293-297 (5 pages)