Model of facial expressions management for an embodied conversational agent

Cited by: 0
|
Authors
Niewiadomski, Radoslaw [1 ]
Pelachaud, Catherine [1 ]
Affiliations
[1] Univ Paris 08, IUT de Montreuil, Paris, France
Keywords
embodied conversational agents; social context; facial expressions;
DOI
Not available
CLC Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In this paper we present a model of facial behaviour that encompasses interpersonal relations for an Embodied Conversational Agent (ECA). Although previous solutions to this problem exist in the ECA domain, our approach is the first to use a variety of facial expressions (i.e. expressed, masked, inhibited, and fake expressions). Moreover, our rules of facial behaviour management are consistent both with the predictions of politeness theory and with experimental data (i.e. the annotation of a video corpus). Knowing the affective state of the agent and the type of relations between the interlocutors, the system automatically adapts the agent's facial behaviour to the social context. We also present the evaluation study we conducted on our model, in which we analysed how interpersonal relations are perceived from the facial behaviour of our agent.
Pages: 12 / +
Page count: 3
Related Papers
50 records
  • [1] Representing Affective Facial Expressions for Robots and Embodied Conversational Agents by Facial Landmarks
    Liu, Caixia
    Ham, Jaap
    Postma, Eric
    Midden, Cees
    Joosten, Bart
    Goudbeek, Martijn
    [J]. International Journal of Social Robotics, 2013, 5 (04) : 619 - 626
  • [2] User preferences can drive facial expressions: evaluating an embodied conversational agent in a recommender dialogue system
    Foster, Mary Ellen
    Oberlander, Jon
    [J]. User Modeling and User-Adapted Interaction, 2010, 20 (04) : 341 - 381
  • [3] Asymmetric facial expressions: revealing richer emotions for embodied conversational agents
    Ahn, Junghyun
    Gobron, Stephane
    Thalmann, Daniel
    Boulic, Ronan
    [J]. Computer Animation and Virtual Worlds, 2013, 24 (06) : 539 - 551
  • [4] FurChat: An Embodied Conversational Agent using LLMs, Combining Open and Closed-Domain Dialogue with Facial Expressions
    Cherakara, Neeraj
    Varghese, Finny
    Shabana, Sheena
    Nelson, Nivan
    Karukayil, Abhiram
    Kulothungan, Rohith
    Farhan, Mohammed Afil
    Nesset, Birthe
    Moujahid, Meriam
    Dinkar, Tanvi
    Rieser, Verena
    Lemon, Oliver
    [J]. 24th Meeting of the Special Interest Group on Discourse and Dialogue, SIGDIAL 2023, 2023 : 588 - 592
  • [5] Using embodied conversational agents in video games to investigate emotional facial expressions
    Lankes, Michael
    Bernhaupt, Regina
    [J]. Entertainment Computing, 2011, 2 (01) : 29 - 37
  • [6] Impression Detection and Management Using an Embodied Conversational Agent
    Wang, Chen
    Biancardi, Beatrice
    Mancini, Maurizio
    Cafaro, Angelo
    Pelachaud, Catherine
    Pun, Thierry
    Chanel, Guillaume
    [J]. Human-Computer Interaction. Multimodal and Natural Interaction, HCI 2020, Pt II, 2020, 12182 : 260 - 278
  • [7] Pedagogical embodied conversational agent
    Doswell, JT
    [J]. IEEE International Conference on Advanced Learning Technologies, Proceedings, 2004 : 774 - 776
  • [8] CATE: An Embodied Conversational Agent for the Elderly
    Bravo, Sean Latrelle
    Herrera, Cedric Jose
    Valdez, Edward Carlo
    Poliquit, Klint John
    Ureta, Jennifer
    Cu, Jocelynn
    Azcarraga, Judith
    Rivera, Joanna Pauline
    [J]. ICAART: Proceedings of the 12th International Conference on Agents and Artificial Intelligence, Vol 2, 2020 : 941 - 948