Understanding Nonverbal Communication Cues of Human Personality Traits in Human-Robot Interaction

Cited by: 0
Authors: Zhihao Shen [1], Armagan Elibol [1], Nak Young Chong [2,1]
Affiliations: [1] Japan Advanced Institute of Science and Technology; [2] IEEE
Funding: European Union Horizon 2020
Keywords: (none listed)
DOI: (not available)
Chinese Library Classification: TP242 [Robotics]
Discipline code: 1111

Abstract
With the increasing presence of robots in daily life, there is a strong demand for strategies that achieve high-quality interaction between robots and users by enabling robots to understand users' mood, intention, and other internal states. In human-human interaction, personality traits strongly influence behavior, decisions, and mood. We therefore propose an efficient computational framework that endows a robot with the capability of inferring a user's personality traits from the user's nonverbal communication cues, represented by three visual features (head motion, gaze, and body motion energy) and three vocal features (voice pitch, voice energy, and mel-frequency cepstral coefficients (MFCC)). In this study, the Pepper robot served as a communication robot that interacted with each participant by asking questions while extracting the nonverbal features of the participant's habitual behavior with its on-board sensors. Separately, each participant's personality traits were assessed with a questionnaire. We then trained ridge regression and linear support vector machine (SVM) classifiers on the nonverbal features and the questionnaire-derived personality trait labels, and evaluated their performance. The proposed models showed promising binary classification performance in recognizing each of the Big Five personality traits of the participants from individual differences in nonverbal communication cues.
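The abstract's pipeline — a six-dimensional nonverbal feature vector per participant, fed to a ridge regression model whose output is thresholded into a binary trait label — can be sketched as follows. This is a minimal illustration using the closed-form ridge solution on synthetic data; the feature layout, regularization strength, and threshold are assumptions for illustration, not the paper's actual configuration.

```python
import numpy as np

# Hypothetical sketch: one ridge-regression binary classifier for a single
# Big Five trait, over six nonverbal cues per participant (head motion,
# gaze, body motion energy, voice pitch, voice energy, MFCC summary).
# The data below is synthetic; it only illustrates the training scheme.

def fit_ridge(X, y, alpha=1.0):
    """Closed-form ridge regression: w = (X^T X + alpha*I)^(-1) X^T y."""
    n_features = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(n_features), X.T @ y)

def predict_binary(X, w, threshold=0.5):
    """Threshold the regression output to obtain a binary trait label."""
    return (X @ w >= threshold).astype(int)

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 6))          # 40 participants x 6 nonverbal cues
w_true = np.array([1.0, -0.5, 0.8, 0.0, 0.3, -0.2])
y = (X @ w_true >= 0).astype(float)   # synthetic binary trait labels

w = fit_ridge(X, y, alpha=0.1)
pred = predict_binary(X, w)
accuracy = (pred == y).mean()
```

In practice one such classifier would be trained per trait, and the same feature matrix could equally be fed to a linear SVM, the second model family the abstract mentions.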
Pages: 1465-1477 (13 pages)