Understanding Nonverbal Communication Cues of Human Personality Traits in Human-Robot Interaction

Cited by: 0
Authors
Zhihao Shen [1]
Armagan Elibol [1]
Nak Young Chong [2,1]
Affiliations
[1] Japan Advanced Institute of Science and Technology
[2] IEEE
Funding
EU Horizon 2020 programme
Keywords
(none listed)
DOI
Not available
Chinese Library Classification (CLC) number
TP242 [Robotics]
Discipline classification code
1111
Abstract
With the increasing presence of robots in our daily lives, there is a strong demand for strategies that achieve high-quality interaction between robots and users by enabling robots to understand users' moods, intentions, and other mental states. During human-human interaction, personality traits strongly influence human behavior, decisions, and moods. We therefore propose an efficient computational framework that endows a robot with the capability of understanding a user's personality traits from the user's nonverbal communication cues, represented by three visual features (head motion, gaze, and body motion energy) and three vocal features (voice pitch, voice energy, and Mel-frequency cepstral coefficients (MFCCs)). In this study, the Pepper robot interacted with each participant by asking questions while extracting the nonverbal features from the participant's habitual behavior using its on-board sensors. Separately, each participant's personality traits were evaluated with a questionnaire. We then trained ridge regression and linear support vector machine (SVM) classifiers on the nonverbal features with the questionnaire-derived personality trait labels and evaluated their performance. The proposed models showed promising binary classification performance in recognizing each of the Big Five personality traits of the participants from individual differences in nonverbal communication cues.
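For readers who want to see the classification stage in code form, the following is a minimal Python sketch of the approach the abstract describes, using scikit-learn's RidgeClassifier and LinearSVC. The synthetic feature matrix, its dimensions, and the binary high/low trait labels are illustrative assumptions, not the authors' actual data or protocol; in the paper, each participant's feature vector would be built from the six nonverbal cues extracted by Pepper's on-board sensors.

import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import RidgeClassifier
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic stand-in: one row per participant, columns summarizing the
# six nonverbal cues (head motion, gaze, body motion energy, voice
# pitch, voice energy, MFCC statistics) over the interaction session.
# The sample size and feature dimension here are assumptions.
n_participants, n_features = 40, 30
X = rng.normal(size=(n_participants, n_features))

# Binary label per participant for one Big Five trait (e.g., high vs.
# low extraversion from the questionnaire) -- labeling scheme assumed.
y = rng.integers(0, 2, size=n_participants)

# Train and cross-validate the two classifier families named in the
# abstract: ridge regression (as a classifier) and a linear SVM.
for name, clf in [("ridge", RidgeClassifier(alpha=1.0)),
                  ("linear SVM", LinearSVC(C=1.0, max_iter=10000))]:
    model = make_pipeline(StandardScaler(), clf)
    scores = cross_val_score(model, X, y, cv=5, scoring="accuracy")
    print(f"{name}: mean accuracy {scores.mean():.2f} (+/- {scores.std():.2f})")

Standardizing the features before the linear models is a common choice (not confirmed by the abstract) when cues such as pitch and motion energy live on very different numeric scales.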
Source: IEEE/CAA Journal of Automatica Sinica, 2020, 7(6)
Pages: 1465-1477 (13 pages)