Sentiment analysis method of consumer reviews based on multi-modal feature mining

Cited by: 0
Authors
You, Jing [1 ,2 ]
Zhong, Jiamin [3 ]
Kong, Jing [1 ]
Peng, Lihua [4 ]
Affiliations
[1] School of Economics and Management, Guangdong Open University (Guangdong Polytechnic Institute), Guangzhou 510091, Guangdong, China
[2] Agricultural Industry and Digital Economy Research Center, Guangdong Open University (Guangdong Polytechnic Institute), Guangzhou 510091, Guangdong, China
[3] Zhongshan Xiaoji Technology Co., Ltd, Zhongshan 528403, Guangdong, China
[4] School of Law and Administration, Guangdong Open University (Guangdong Polytechnic Institute), Guangzhou 510091, Guangdong, China
Funding
National Natural Science Foundation of China
Keywords
Data mining; Image coding; Signal encoding; Video analysis
DOI
10.1016/j.ijcce.2024.12.001
Abstract
Traditional sentiment analysis methods rely primarily on textual data. In real-world applications, however, product reviews often contain multimodal information such as images, videos, and audio, and this multimodal data is crucial for accurately understanding consumer sentiment trends. This article therefore proposes a novel product review sentiment analysis method based on multimodal feature mining. First, pre-trained vision transformer (ViT) and Baidu general embedding (BGE) models are used to encode image and text features, respectively. Internal correlations between the image and text features are then mined through cross-attention, and the two modalities are aligned with a mask graph transformer network. The final encoding is obtained using a multi-layer perceptron. Lastly, cosine similarity is computed between this encoding and the BGE encoding of each sentiment aspect to determine the corresponding sentiment scores. Simulations on the multimodal Multi-ZOL dataset compare the proposed method with several state-of-the-art techniques, and the results confirm its superior performance.
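To make the described pipeline concrete, the following is a minimal PyTorch sketch of the fusion-and-scoring stages only, under stated assumptions: image and text features are presumed to be pre-extracted by a pretrained ViT and BGE encoder (not loaded here), the mask graph transformer alignment step is simplified to a standard transformer encoder, and all names (MultimodalSentimentScorer, align, aspect_embeds, etc.) are hypothetical illustrations rather than the authors' implementation.

```python
# Sketch of cross-attention fusion, alignment, MLP encoding, and
# cosine-similarity aspect scoring, assuming pre-extracted features.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultimodalSentimentScorer(nn.Module):
    def __init__(self, dim=768, heads=8):
        super().__init__()
        # Cross-attention: text tokens attend to image patches, and vice versa.
        self.text_to_image = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.image_to_text = nn.MultiheadAttention(dim, heads, batch_first=True)
        # Stand-in for the mask graph transformer alignment module.
        self.align = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(dim, heads, batch_first=True), num_layers=2)
        # Multi-layer perceptron producing the final review encoding.
        self.mlp = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))

    def forward(self, text_feats, image_feats, aspect_embeds):
        # text_feats:    (B, Lt, dim) token-level text features (e.g. from BGE)
        # image_feats:   (B, Li, dim) patch-level image features (e.g. from ViT)
        # aspect_embeds: (A, dim)     embeddings of the sentiment aspects
        t2i, _ = self.text_to_image(text_feats, image_feats, image_feats)
        i2t, _ = self.image_to_text(image_feats, text_feats, text_feats)
        fused = self.align(torch.cat([t2i, i2t], dim=1))   # joint alignment
        review = self.mlp(fused.mean(dim=1))                # (B, dim) final encoding
        # Cosine similarity against each sentiment-aspect embedding -> scores.
        return F.cosine_similarity(review.unsqueeze(1), aspect_embeds.unsqueeze(0), dim=-1)

# Usage with random placeholder tensors (real features would come from ViT/BGE):
model = MultimodalSentimentScorer()
scores = model(torch.randn(2, 16, 768), torch.randn(2, 49, 768), torch.randn(5, 768))
print(scores.shape)  # torch.Size([2, 5]): one sentiment score per aspect
```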
Pages: 143-151