Detection of Affective States From Text and Speech for Real-Time Human-Computer Interaction

Cited by: 11
Authors
Calix, Ricardo A. [1 ]
Javadpour, Leili [1 ]
Knapp, Gerald M. [1 ]
Affiliation
[1] Louisiana State Univ, Baton Rouge, LA 70803 USA
Keywords
knowledge representation; cognitive processes; language; human-computer interaction;
DOI
10.1177/0018720811425922
Chinese Library Classification
B84 [Psychology]; C [Social Sciences, General]; Q98 [Anthropology]
Discipline Classification Codes
03; 0303; 030303; 04; 0402
Abstract
Objective: The goal of this work is to develop and test an automated system methodology that can detect emotion from text and speech features. Background: Affective human-computer interaction will be critical for the success of new systems that will be prevalent in the 21st century. Such systems will need to properly deduce human emotional state before they can determine how to best interact with people. Method: Corpora and machine learning classification models are used to train and test a methodology for emotion detection. The methodology uses a step-wise approach to detect sentiment in sentences by first filtering out neutral sentences, then distinguishing among positive, negative, and five emotion classes. Results: Results of the classification between emotion and neutral sentences achieved recall accuracies as high as 77% in the University of Illinois at Urbana-Champaign (UIUC) corpus and 61% in the Louisiana State University medical drama (LSU-MD) corpus for emotion samples. Once neutral sentences were filtered out, the methodology achieved accuracy scores for detecting negative sentences as high as 92.3%. Conclusion: Results of the feature analysis indicate that speech spectral features are better than speech prosodic features for emotion detection. Accumulated sentiment composition text features appear to be very important as well. This work contributes to the study of human communication by providing a better understanding of how language factors help to best convey human emotion and how to best automate this process. Application: Results of this study can be used to develop better automated assistive systems that interpret human language and respond to emotions through 3-D computer graphics.
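The stepwise approach described in the abstract, which first filters out neutral sentences and only then assigns a sentiment polarity, can be sketched as a two-stage cascade. This is an illustrative reconstruction, not the authors' system: the toy word lexicons and rule-based stages below stand in for the trained machine learning classifiers and speech/text features used in the paper.

```python
# Sketch of a stepwise sentiment cascade: stage 1 filters out
# neutral sentences, stage 2 labels the survivors positive or
# negative. Toy lexicons replace the paper's trained classifiers.

POSITIVE = {"great", "happy", "love", "wonderful"}
NEGATIVE = {"terrible", "sad", "hate", "awful"}

def is_emotional(sentence: str) -> bool:
    """Stage 1: keep only sentences carrying any sentiment cue."""
    words = set(sentence.lower().split())
    return bool(words & (POSITIVE | NEGATIVE))

def polarity(sentence: str) -> str:
    """Stage 2: distinguish positive from negative, run only on
    sentences that survived the neutral filter in stage 1."""
    words = set(sentence.lower().split())
    pos, neg = len(words & POSITIVE), len(words & NEGATIVE)
    return "positive" if pos >= neg else "negative"

def classify(sentence: str) -> str:
    """Full cascade: neutral filter first, then polarity."""
    if not is_emotional(sentence):
        return "neutral"
    return polarity(sentence)
```

The design point the cascade illustrates is that the neutral-vs-emotional decision and the positive-vs-negative decision are separate classification problems; the paper reports that filtering neutral sentences first lets the second stage reach much higher accuracy (up to 92.3% for negative sentences).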
Pages: 530-545
Page count: 16
Related Papers (50 records)
  • [1] A Novel Real-Time Eye Detection in Human-Computer Interaction
    Yan Chao
    Wang Yuanqing
    Zhang Zhaoyang
    [J]. INNOVATIVE COMPUTING AND INFORMATION, PT II, 2011, 232 : 530 - +
  • [2] A Novel Real-Time Eye Detection in Human-Computer Interaction
    Yan, Chao
    Wang, Yuanqing
    Zhang, Zhaoyang
    [J]. 2010 SECOND INTERNATIONAL CONFERENCE ON E-LEARNING, E-BUSINESS, ENTERPRISE INFORMATION SYSTEMS, AND E-GOVERNMENT (EEEE 2010), VOL I, 2010, : 57 - 62
  • [3] Real-Time Human-Computer Interaction Using Eye Gazes
    Chen, Haodong
    Zendehdel, Niloofar
    Leu, Ming C.
    Yin, Zhaozheng
    [J]. MANUFACTURING LETTERS, 2023, 35 : 883 - 894
  • [5] Expressive Gibberish Speech Synthesis for Affective Human-Computer Interaction
    Yilmazyildiz, Selma
    Latacz, Lukas
    Mattheyses, Wesley
    Verhelst, Werner
    [J]. TEXT, SPEECH AND DIALOGUE, 2010, 6231 : 584 - 590
  • [6] Real-Time Continuous Gesture Recognition for Natural Human-Computer Interaction
    Yin, Ying
    Davis, Randall
    [J]. 2014 IEEE SYMPOSIUM ON VISUAL LANGUAGES AND HUMAN-CENTRIC COMPUTING (VL/HCC 2014), 2014, : 113 - 120
  • [7] Real-time visual recognition of facial gestures for human-computer interaction
    Zelinsky, A
    Heinzmann, J
    [J]. PROCEEDINGS OF THE SECOND INTERNATIONAL CONFERENCE ON AUTOMATIC FACE AND GESTURE RECOGNITION, 1996, : 351 - 356
  • [8] HeadTrack: Real-Time Human-Computer Interaction via Wireless Earphones
    Hu, Jingyang
    Jiang, Hongbo
    Xiao, Zhu
    Chen, Siyu
    Dustdar, Schahram
    Liu, Jiangchuan
    [J]. IEEE JOURNAL ON SELECTED AREAS IN COMMUNICATIONS, 2024, 42 (04) : 990 - 1002
  • [9] Cross-Topic Opinion Mining for Real-Time Human-Computer Interaction
    Balahur, Alexandra
    Boldrini, Ester
    Montoyo, Andres
    Martinez-Barco, Patricio
    [J]. NATURAL LANGUAGE PROCESSING AND COGNITIVE SCIENCE, PROCEEDINGS, 2009, : 13 - 22
  • [10] A Dynamic Head Gesture Recognition Method for Real-Time Human-Computer Interaction
    Xie, Jialong
    Zhang, Botao
    Chepinskiy, Sergey A.
    Zhilenkov, Anton A.
    [J]. INTELLIGENT ROBOTICS AND APPLICATIONS, ICIRA 2021, PT III, 2021, 13015 : 235 - 245