A cognitive brain model for multimodal sentiment analysis based on attention neural networks

Cited by: 29
Authors
Li, Yuanqing [1 ]
Zhang, Ke [1 ]
Wang, Jingyu [1 ]
Gao, Xinbo [2 ]
Affiliations
[1] Northwestern Polytech Univ, Xian 710072, Shaanxi, Peoples R China
[2] Chongqing Univ Posts & Telecommun, Chongqing Key Lab Image Cognit, Chongqing 400065, Peoples R China
Keywords
Multimodal sentiment analysis; Cognitive neuroscience; Hash algorithm; Attention-based BiLSTM; SPEECH; CLASSIFICATION; FUSION; TEXT; FRAMEWORK;
DOI
10.1016/j.neucom.2020.10.021
CLC number
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Multimodal sentiment analysis is one of the most attractive interdisciplinary research topics in artificial intelligence (AI). Unlike many other classification tasks, multimodal sentiment analysis of human emotion is a much finer-grained classification problem. However, most current work accepts all modalities as input together and outputs final results in a single pass after the fusion and decision processes; few models are divided into multiple fusion modules with different fusion strategies to better adapt to different tasks. Additionally, most recent multimodal sentiment analysis methods focus on binary classification, while the accuracy of multi-class classification remains difficult to improve. Inspired by the emotional processing procedure described in cognitive science, our method improves both binary and multi-class classification by dividing the complicated problem into smaller issues that are easier to handle. In this paper, we propose a Hierarchical Attention-BiLSTM (Bidirectional Long Short-Term Memory) model based on the Cognitive Brain limbic system (HALCB). HALCB splits multimodal sentiment analysis into two modules responsible for two tasks: binary classification and multi-class classification. The former module divides the input items into two categories by recognizing their polarity and then routes them to the latter module separately; a hash algorithm is utilized here to improve retrieval accuracy and speed. Correspondingly, the latter module contains a positive sub-net dedicated to positive inputs and a negative sub-net dedicated to negative inputs. The binary module and the two sub-nets of the multi-class module each possess a different fusion strategy and decision layer matched to their respective functions. We also add a random forest at the final stage to collect the outputs from all modules and fuse them at the decision level.
Experiments are conducted on three datasets, and the results are compared with baselines on both binary and multi-class classification tasks. Our experimental results surpass the state-of-the-art multimodal sentiment analysis methods on both binary and multi-class classification by a large margin. (c) 2020 Elsevier B.V. All rights reserved.
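As a rough sketch of the pipeline the abstract describes (a sketch under stated assumptions, not the authors' implementation): a binary polarity stage routes each sample to a polarity-specific sub-net, and a final stage fuses all module outputs at the decision level. The function `halcb_predict`, the toy `valence` feature, the 3-way sub-net outputs, and the 6-way label mapping below are all hypothetical stand-ins for the paper's Attention-BiLSTM fusion modules and random forest:

```python
from typing import Callable, Dict, List

def halcb_predict(
    sample: Dict[str, float],
    polarity_clf: Callable[[dict], int],             # stage 1: binary polarity
    positive_subnet: Callable[[dict], List[float]],  # stage 2a: fine-grained, positive
    negative_subnet: Callable[[dict], List[float]],  # stage 2b: fine-grained, negative
    decision_fuser: Callable[[int, List[float]], int],  # stage 3: decision-level fusion
) -> int:
    """Route one multimodal sample through the two-stage hierarchy."""
    polarity = polarity_clf(sample)  # 1 = positive, 0 = negative
    scores = positive_subnet(sample) if polarity else negative_subnet(sample)
    return decision_fuser(polarity, scores)

# Toy stand-in classifiers that score a single hypothetical 'valence' feature.
polarity = lambda s: int(s["valence"] > 0)
pos_net = lambda s: [0.1, 0.2, 0.7]   # e.g. weakly / moderately / strongly positive
neg_net = lambda s: [0.6, 0.3, 0.1]   # e.g. strongly / moderately / weakly negative
# Fuse polarity + sub-net argmax into one 6-way label (0-2 negative, 3-5 positive).
fuse = lambda pol, sc: (3 if pol else 0) + max(range(len(sc)), key=sc.__getitem__)

label = halcb_predict({"valence": 0.8}, polarity, pos_net, neg_net, fuse)  # -> 5
```

The point of the structure is that each stage can carry its own fusion strategy and decision layer, so the hard multi-class problem is only ever solved within one polarity at a time.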
Pages: 159-173 (15 pages)
Related Papers
50 items in total
  • [1] Attention-based multimodal sentiment analysis and emotion recognition using deep neural networks
    Aslam, Ajwa
    Sargano, Allah Bux
    Habib, Zulfiqar
    [J]. APPLIED SOFT COMPUTING, 2023, 144
  • [2] A Weibo Sentiment Analysis Model Based on Attention Mechanisms and Deep Neural Networks
    Deng, Chao
    Chen, Yu
    [J]. Journal of Network Intelligence, 2024, 9 (03): 1525 - 1536
  • [4] Multimodal Sentiment Analysis Using Deep Neural Networks
    Abburi, Harika
    Prasath, Rajendra
    Shrivastava, Manish
    Gangashetty, Suryakanth V.
    [J]. MINING INTELLIGENCE AND KNOWLEDGE EXPLORATION (MIKE 2016), 2017, 10089 : 58 - 65
  • [6] Multimodal sentiment analysis based on cross-instance graph neural networks
    Wang, Hongbin
    Ren, Chun
    Yu, Zhengtao
    [J]. APPLIED INTELLIGENCE, 2024, 54 (04) : 3403 - 3416
  • [7] GATED MECHANISM FOR ATTENTION BASED MULTIMODAL SENTIMENT ANALYSIS
    Kumar, Ayush
    Vepa, Jithendra
    [J]. 2020 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING, 2020, : 4477 - 4481
  • [8] Multimodal Brain Image Analysis and Survival Prediction Using Neuromorphic Attention-Based Neural Networks
    Han, Il Song
    [J]. BRAINLESION: GLIOMA, MULTIPLE SCLEROSIS, STROKE AND TRAUMATIC BRAIN INJURIES (BRAINLES 2020), PT I, 2021, 12658 : 194 - 206
  • [9] A Pragmatic Approach to Emoji based Multimodal Sentiment Analysis using Deep Neural Networks
    Kumar, T. Praveen
    Vardhan, B. Vishnu
    [J]. JOURNAL OF ALGEBRAIC STATISTICS, 2022, 13 (01) : 473 - 482
  • [10] Multimodal Sentiment Analysis Based on Bidirectional Mask Attention Mechanism
    Zhang Y.
    Zhang H.
    Liu Y.
    Liang K.
    Wang Y.
    [J]. Data Analysis and Knowledge Discovery, 2023, 7 (04) : 46 - 55