Understanding Nonverbal Communication Cues of Human Personality Traits in Human-Robot Interaction

Cited by: 0
Authors:
Zhihao Shen [1]
Armagan Elibol [1]
Nak Young Chong [2,1]
Affiliations:
[1] Japan Advanced Institute of Science and Technology
[2] IEEE
Funding:
EU Horizon 2020
Keywords:
DOI: Not available
Chinese Library Classification (CLC): TP242 [Robotics]
Subject classification code: 1111
Abstract
With the increasing presence of robots in daily life, there is a strong demand for strategies that enable high-quality interaction between robots and users by allowing robots to understand users' mood, intention, and related states. In human-human interaction, personality traits strongly influence behavior, decisions, mood, and many other aspects of a person. We therefore propose an efficient computational framework that endows a robot with the ability to infer a user's personality traits from nonverbal communication cues, represented by three visual features (head motion, gaze, and body motion energy) and three vocal features (voice pitch, voice energy, and mel-frequency cepstral coefficients, MFCCs). In this study, the Pepper robot served as a communication robot that interacted with each participant by asking questions while extracting the nonverbal features from the participant's habitual behavior using its on-board sensors. Each participant's personality traits were separately assessed with a questionnaire. We then trained ridge regression and linear support vector machine (SVM) classifiers on the nonverbal features and the questionnaire-derived personality trait labels, and evaluated their performance. The proposed models showed promising binary classification performance in recognizing each of the Big Five personality traits of the participants from individual differences in their nonverbal communication cues.
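The abstract describes the learning pipeline only at a high level. As an illustration, the minimal scikit-learn sketch below shows one plausible way to train per-trait binary classifiers on the six nonverbal features with ridge and linear-SVM models; the data, feature summaries, variable names, and hyperparameters are hypothetical placeholders, not the authors' implementation or dataset.

# Minimal sketch (not the authors' code): per-trait binary classification
# from a 6-dimensional nonverbal feature vector, using scikit-learn
# stand-ins for the ridge and linear-SVM classifiers named in the abstract.
import numpy as np
from sklearn.linear_model import RidgeClassifier
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# One row per participant: [head_motion, gaze, body_motion_energy,
#  voice_pitch, voice_energy, mfcc_summary] -- hypothetical per-session
# summary statistics; the values here are synthetic.
X = rng.normal(size=(40, 6))

# Hypothetical binary labels per Big Five trait (e.g., above/below the
# median questionnaire score); random here, so accuracies stay near chance.
traits = ["openness", "conscientiousness", "extraversion",
          "agreeableness", "neuroticism"]
y = {t: rng.integers(0, 2, size=40) for t in traits}

models = {
    "ridge": make_pipeline(StandardScaler(), RidgeClassifier(alpha=1.0)),
    "linear_svm": make_pipeline(StandardScaler(), LinearSVC(C=1.0)),
}

# Train and evaluate one binary classifier per trait with 5-fold CV.
for trait in traits:
    for name, model in models.items():
        acc = cross_val_score(model, X, y[trait], cv=5, scoring="accuracy")
        print(f"{trait:<17s} {name:<10s} mean accuracy = {acc.mean():.2f}")

Standardizing the features before fitting is a sketch-level choice rather than a detail from the paper; it matters here because visual and vocal cues live on very different numeric scales.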
Pages: 1465-1477 (13 pages)
Related papers (50 in total):
  • [31] Liu, Oliver; Rakita, Daniel; Mutlu, Bilge; Gleicher, Michael. Understanding Human-Robot Interaction in Virtual Reality. 2017 26th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), 2017: 751-757.
  • [32] Patel, Maithili; Dogan, Fethiye Irmak; Zeng, Zhen; Baraka, Kim; Chernova, Sonia. Semantic Scene Understanding for Human-Robot Interaction. Companion of the ACM/IEEE International Conference on Human-Robot Interaction (HRI 2023), 2023: 941-943.
  • [33] Walters, ML; Dautenhahn, K; Boekhorst, RT; Koay, KL; Kaouri, C; Woods, S; Nehaniv, C; Lee, D; Werry, I. The influence of subjects' personality traits on personal spatial zones in a human-robot interaction experiment. 2005 IEEE International Workshop on Robot and Human Interactive Communication (RO-MAN), 2005: 347-352.
  • [34] Fiore, Stephen M.; Wiltshire, Travis J.; Lobato, Emilio J. C.; Jentsch, Florian G.; Huang, Wesley H.; Axelrod, Benjamin. Toward understanding social cues and signals in human-robot interaction: effects of robot gaze and proxemic behavior. Frontiers in Psychology, 2013, 4.
  • [35] Ashok, Ashita; Pawlak, Jakub; Paplu, Sarwar; Zafar, Zuhair; Berns, Karsten. Paralinguistic Cues in Speech to Adapt Robot Behavior in Human-Robot Interaction. 2022 9th IEEE RAS/EMBS International Conference on Biomedical Robotics and Biomechatronics (BioRob 2022), 2022.
  • [36] Woods, Sarah; Dautenhahn, Kerstin; Kaouri, Christina; te Boekhorst, René; Koay, Kheng Lee; Walters, Michael L. Are robots like people? Relationships between participant and robot personality traits in human-robot interaction studies. Interaction Studies, 2007, 8(2): 281-305.
  • [37] Williams, Tom; Briggs, Priscilla; Scheutz, Matthias. Covert Robot-Robot Communication: Human Perceptions and Implications for Human-Robot Interaction. Journal of Human-Robot Interaction, 2015, 4(2): 24-49.
  • [38] Duarte, Nuno Ferreira; Rakovic, Mirko; Marques, Jorge; Santos-Victor, Jose. Action Alignment from Gaze Cues in Human-Human and Human-Robot Interaction. Computer Vision - ECCV 2018 Workshops, Part III, 2019, 11131: 197-212.
  • [39] Frijns, Helena Anna; Schuerer, Oliver; Koeszegi, Sabine Theresia. Communication Models in Human-Robot Interaction: An Asymmetric MODel of ALterity in Human-Robot Interaction (AMODAL-HRI). International Journal of Social Robotics, 2023, 15(3): 473-500.
  • [40] Jensen, Lars Christian; Langedijk, Rosalyn Melissa; Fischer, Kerstin. Understanding the Perception of Incremental Robot Response in Human-Robot Interaction. 2020 29th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), 2020: 41-47.