Gender-Driven Emotion Recognition Through Speech Signals for Ambient Intelligence Applications

Cited by: 57
Authors
Bisio, Igor [1 ]
Delfino, Alessandro [1 ]
Lavagetto, Fabio [1 ]
Marchese, Mario [1 ]
Sciarrone, Andrea [1 ]
Affiliation
[1] Univ Genoa, Dept Elect Elect Telecommun Engn & Naval Architec, I-16145 Genoa, Italy
Keywords
Human-computer intelligent interaction; gender recognition; emotion recognition; pitch estimation; support vector machine; FEATURES; AUDIO;
DOI
10.1109/TETC.2013.2274797
CLC Classification Number
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
This paper proposes a system that recognizes a person's emotional state from recorded audio signals. The solution aims to improve the interaction between humans and computers, enabling effective human-computer intelligent interaction. The system recognizes six emotions (anger, boredom, disgust, fear, happiness, and sadness) plus the neutral state, a set widely used for emotion recognition purposes. It can also distinguish a single emotion from all the other possible ones, as shown in the reported numerical results. The system is composed of two subsystems: 1) gender recognition (GR) and 2) emotion recognition (ER). The experimental analysis reports the accuracy of the proposed ER system. The results highlight that a priori knowledge of the speaker's gender yields a performance increase. They also show that adopting feature selection ensures a satisfactory recognition rate while reducing the number of features employed. Future developments of the proposed solution may include its implementation on mobile devices such as smartphones.
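As a rough illustration of the two-stage, gender-driven architecture outlined in the abstract (a gender recognizer selecting a gender-specific emotion classifier, with support vector machines as named in the keywords), the following Python sketch shows one way such a pipeline could be wired together using scikit-learn. The feature extraction, function names, and per-gender model layout are assumptions for illustration only; the paper's actual feature set, pitch-based GR stage, and feature-selection procedure are not reproduced here.

```python
# Minimal sketch of a gender-driven emotion recognition pipeline:
# a gender SVM routes each utterance to a gender-specific emotion SVM.
# All names and the feature format are illustrative assumptions.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

EMOTIONS = ["anger", "boredom", "disgust", "fear", "happiness", "sadness", "neutral"]

def train_gender_driven_er(X, y_gender, y_emotion):
    """Train one gender SVM plus one emotion SVM per gender.

    X         : (n_samples, n_features) NumPy array of acoustic feature vectors
    y_gender  : NumPy array of "male"/"female" labels
    y_emotion : NumPy array of labels drawn from EMOTIONS
    """
    # Gender recognition (GR) stage.
    gender_clf = make_pipeline(StandardScaler(), SVC(kernel="rbf")).fit(X, y_gender)
    # Emotion recognition (ER) stage: one model per gender.
    emotion_clfs = {}
    for g in ("male", "female"):
        mask = (y_gender == g)
        emotion_clfs[g] = make_pipeline(StandardScaler(), SVC(kernel="rbf")).fit(X[mask], y_emotion[mask])
    return gender_clf, emotion_clfs

def predict_emotion(x, gender_clf, emotion_clfs):
    """Classify gender first, then apply the matching emotion model."""
    x = np.atleast_2d(x)
    g = gender_clf.predict(x)[0]
    return g, emotion_clfs[g].predict(x)[0]
```

Training separate emotion models per gender mirrors the paper's finding that a priori knowledge of the speaker's gender improves ER accuracy; in a fuller reimplementation, a feature-selection step would sit between feature extraction and the SVMs.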
Pages: 244-257
Number of pages: 14
Related Papers
50 records in total
  • [1] Gender-Driven English Speech Emotion Recognition with Genetic Algorithm
    Yue, Liya
    Hu, Pei
    Zhu, Jiulong
    [J]. BIOMIMETICS, 2024, 9 (06)
  • [2] Gender Specific Emotion Recognition Through Speech Signals
    Vinay
    Gupta, Shilpi
    Mehra, Anu
    [J]. 2014 INTERNATIONAL CONFERENCE ON SIGNAL PROCESSING AND INTEGRATED NETWORKS (SPIN), 2014, : 727 - 733
  • [3] Use of Multiple Classifier System for Gender Driven Speech Emotion Recognition
    Ladde, Pravina P.
    Deshmukh, Vaishali S.
    [J]. 2015 INTERNATIONAL CONFERENCE ON COMPUTATIONAL INTELLIGENCE AND COMMUNICATION NETWORKS (CICN), 2015, : 713 - 717
  • [4] Emotion recognition from Mandarin speech signals
    Pao, TL
    Chen, YT
    Yeh, JH
    [J]. 2004 INTERNATIONAL SYMPOSIUM ON CHINESE SPOKEN LANGUAGE PROCESSING, PROCEEDINGS, 2004, : 301 - 304
  • [5] Techniques and Applications of Emotion Recognition in Speech
    Lugovic, S.
    Dunder, I.
    Horvat, M.
    [J]. 2016 39TH INTERNATIONAL CONVENTION ON INFORMATION AND COMMUNICATION TECHNOLOGY, ELECTRONICS AND MICROELECTRONICS (MIPRO), 2016, : 1278 - 1283
  • [6] Informative Speech Features based on Emotion Classes and Gender in Explainable Speech Emotion Recognition
    Yildirim, Huseyin Ediz
    Iren, Deniz
    [J]. 2023 11TH INTERNATIONAL CONFERENCE ON AFFECTIVE COMPUTING AND INTELLIGENT INTERACTION WORKSHOPS AND DEMOS, ACIIW, 2023,
  • [7] Multimodal emotion recognition for the fusion of speech and EEG signals
    Ma, Jianghe
    Sun, Ying
    Zhang, Xueying
    [J]. Xi'an Dianzi Keji Daxue Xuebao/Journal of Xidian University, 2019, 46 (01): : 143 - 150
  • [8] Emotion recognition and evaluation from Mandarin speech signals
    Pao, Tsanglong
    Chen, Yute
    Yeh, Junheng
    [J]. INTERNATIONAL JOURNAL OF INNOVATIVE COMPUTING INFORMATION AND CONTROL, 2008, 4 (07): : 1695 - 1709
  • [9] Improving Automatic Emotion Recognition from Speech Signals
    Bozkurt, Elif
    Erzin, Engin
    Erdem, Cigdem Eroglu
    Erdem, A. Tanju
    [J]. INTERSPEECH 2009: 10TH ANNUAL CONFERENCE OF THE INTERNATIONAL SPEECH COMMUNICATION ASSOCIATION 2009, VOLS 1-5, 2009, : 312 - +
  • [10] Emotion Recognition On Speech Signals Using Machine Learning
    Ghai, Mohan
    Lal, Shamit
    Duggal, Shivam
    Manik, Shrey
    [J]. PROCEEDINGS OF THE 2017 INTERNATIONAL CONFERENCE ON BIG DATA ANALYTICS AND COMPUTATIONAL INTELLIGENCE (ICBDAC), 2017, : 34 - 39