Deriving Minimal Features for Human-Like Facial Expressions in Robotic Faces

Cited by: 0
Authors
Casey C. Bennett
Selma Šabanović
Affiliations
[1] Indiana University, School of Informatics and Computing
[2] Centerstone Research Institute, Department of Informatics
Keywords
Human–robot interaction; Facial expression; Emotion; Minimalist design; Robot design;
DOI
Not available
Abstract
This study explores deriving minimal features for a robotic face to convey information (via facial expressions) that people can perceive and understand. Recent research in computer vision has shown that a small number of moving points/lines can be used to capture the majority of information (~95%) in human facial expressions. Here, we apply such findings to a minimalist robot face design, which was run through a series of experiments with human subjects (n = 75) exploring the effect of various factors, including added neck motion and degree of expression. Facial expression identification rates were similar to those of more complex robots. In addition, added neck motion significantly improved facial expression identification rates to 100% for all expressions (except Fear). The Negative Attitudes towards Robots Scale (NARS) and Godspeed scales were also collected to examine user perceptions, e.g. perceived animacy and intelligence. The project aims to answer a number of fundamental questions about robotic face design, as well as to develop inexpensive and replicable robotic faces for experimental purposes.
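To make the "minimal features" idea concrete, the sketch below is a minimal illustration, not the authors' implementation: it models a minimalist robot face as a handful of point/line parameters, maps basic emotions to hypothetical parameter offsets, and blends toward neutral to vary degree of expression, with a neck-pitch term standing in for added neck motion. All names and numeric values are assumptions for illustration only.

```python
# Illustrative sketch of a minimalist robot face (hypothetical parameters/values,
# not taken from the paper): a few moving points/lines per expression.
from dataclasses import dataclass


@dataclass
class FacePose:
    brow_left: float    # eyebrow tilt, degrees (+ raises inner edge)
    brow_right: float
    eye_open: float     # eyelid aperture, 0.0 (closed) to 1.0 (wide open)
    mouth_curve: float  # mouth curvature, -1.0 (frown) to +1.0 (smile)
    mouth_open: float   # 0.0 (closed) to 1.0 (fully open)
    neck_pitch: float   # optional neck motion, degrees (+ looks up)


NEUTRAL = FacePose(0.0, 0.0, 0.5, 0.0, 0.0, 0.0)

# Hypothetical offsets for the basic emotions used in expression studies.
EXPRESSIONS = {
    "happiness": FacePose(5.0, 5.0, 0.6, 0.8, 0.2, 2.0),
    "sadness":   FacePose(-10.0, -10.0, 0.3, -0.6, 0.0, -8.0),
    "anger":     FacePose(-15.0, -15.0, 0.9, -0.4, 0.1, -2.0),
    "fear":      FacePose(12.0, 12.0, 1.0, -0.2, 0.5, -4.0),
    "surprise":  FacePose(15.0, 15.0, 1.0, 0.0, 0.7, 4.0),
    "disgust":   FacePose(-8.0, 2.0, 0.4, -0.5, 0.1, 0.0),
}


def blend(expression: str, degree: float) -> FacePose:
    """Interpolate between the neutral pose and a full expression.

    `degree` in [0, 1] plays the role of 'degree of expression';
    the interpolation scheme here is purely illustrative.
    """
    target = EXPRESSIONS[expression]
    return FacePose(*(n + degree * (t - n)
                      for n, t in zip(vars(NEUTRAL).values(),
                                      vars(target).values())))


if __name__ == "__main__":
    print(blend("happiness", 0.5))  # half-intensity smile pose
```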
Pages: 367–381
Number of pages: 14
Related Papers
50 items in total
  • [31] Human-Like Bilateral Robotic Arm Controls for Remote Pipe Maintenance
    Zhou, Tianyu
    Zhu, Qi
    Du, Jing
    COMPUTING IN CIVIL ENGINEERING 2021, 2022, : 1050 - 1058
  • [32] Synergistic control of soft robotic hands for human-like grasp postures
    Zhang, NingBin
    Zhao, Yi
    Gu, GuoYing
    Zhu, XiangYang
    SCIENCE CHINA-TECHNOLOGICAL SCIENCES, 2022, 65 (03) : 553 - 568
  • [33] Towards a human-like movements generator based on environmental features
    Zonta, A.
    Smit, S. K.
    Eiben, A. E.
    2020 IEEE SYMPOSIUM SERIES ON COMPUTATIONAL INTELLIGENCE (SSCI), 2020, : 413 - 420
  • [34] Autistic traits and sensitivity to human-like features of robot behavior
    Wykowska, Agnieszka
    Kajopoulos, Jasmin
    Ramirez-Amaro, Karinne
    Cheng, Gordon
    INTERACTION STUDIES, 2015, 16 (02) : 219 - 248
  • [35] Criterion for human arm in reaching tasks and human-like motion planning of robotic arm
    Zhao, Jing
    Guo, Xingwei
    Xie, Biyun
    Jixie Gongcheng Xuebao/Journal of Mechanical Engineering, 2015, 51 (23) : 21 - 27
  • [36] Human-like motion planning of robotic arms based on human arm motion patterns
    Zhao, Jing
    Wang, Chengyun
    Xie, Biyun
    ROBOTICA, 2023, 41 (01) : 259 - 276
  • [37] Human-like robot head that has olfactory sensation and facial color expression
    Miwa, H
    Umetsu, T
    Takanishi, A
    Takanobu, H
    2001 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, VOLS I-IV, PROCEEDINGS, 2001, : 459 - 464
  • [38] Lightweight Human-Like Robotic Leg with Four-Bar Mechanism Joints
    Sivak, Oleksandr
    Mianowski, Krzysztof
    Vonwirth, Patrick
    Berns, Karsten
    WALKING ROBOTS INTO REAL WORLD, CLAWAR 2024 CONFERENCE, VOL 1, 2024, 1114 : 255 - 265
  • [39] Index Finger of a Human-Like Robotic Hand Using Thin Soft Muscles
    Faudzi, Ahmad Athif Mohd
    Ooga, Junichiro
    Goto, Tatsuhiko
    Takeichi, Masashi
    Suzumori, Koichi
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2018, 3 (01) : 92 - 99
  • [40] Generating robotic elliptical excisions with human-like tool-tissue interactions
    Straizys, Arturas
    Burke, Michael
    Ramamoorthy, Subramanian
    2024 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA 2024), 2024, : 15017 - 15023