Robots as Moral Advisors: The Effects of Deontological, Virtue, and Confucian Role Ethics on Encouraging Honest Behavior

Cited by: 23
Authors
Kim, Boyoung [1 ]
Wen, Ruchen [2 ]
Zhu, Qin [2 ]
Williams, Tom [2 ]
Phillips, Elizabeth [3 ]
Affiliations
[1] US Air Force Acad, Colorado Springs, CO 80840 USA
[2] Colorado Sch Mines, Golden, CO 80401 USA
[3] George Mason Univ, Fairfax, VA 22030 USA
Keywords
Human-Robot Interaction; Moral Communication; Deontological Ethics; Virtue Ethics; Confucian Role Ethics; Cultural Orientations;
DOI
10.1145/3434074.3446908
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
We examined how robots can successfully serve as moral advisors for humans. We evaluated the effectiveness of moral advice grounded in deontological, virtue, and Confucian role ethics frameworks in encouraging humans to make honest decisions. Participants were introduced to a tempting situation where extra monetary gain could be earned by choosing to cheat (i.e., violating the norm of honesty). Prior to their decision, a robot encouraged honest choices by offering a piece of moral advice grounded in one of the three ethics frameworks. While the robot's advice was overall not effective at discouraging dishonest choices, there was preliminary evidence indicating the relative effectiveness of moral advice drawn from deontology. We also explored how different cultural orientations (i.e., vertical and horizontal collectivism and individualism) influence honest decisions across differentially-framed moral advice. We found that individuals with a strong cultural orientation of establishing their own power and status through competition (i.e., high vertical individualism) were more likely to make dishonest choices, especially when moral advice was drawn from virtue ethics. Our findings suggest the importance of considering different ethical frameworks and cultural differences to design robots that can guide humans to comply with the norm of honesty.
Pages: 10-18
Page count: 9