Multimodal Fusion Method Based on Self-Attention Mechanism

Cited by: 10
Authors:
Zhu, Hu [1 ]
Wang, Ze [2 ]
Shi, Yu [3 ]
Hua, Yingying [1 ]
Xu, Guoxia [4 ]
Deng, Lizhen [5 ]
Affiliations:
[1] Nanjing Univ Posts & Telecommun, Jiangsu Prov Key Lab Image Proc & Image Commun, Nanjing 210003, Peoples R China
[2] China Acad Launch Vehicle Technol, R&D Ctr, Beijing 100176, Peoples R China
[3] Nanjing Univ Posts & Telecommun, Bell Honors Sch, Nanjing 210003, Peoples R China
[4] Norwegian Univ Sci & Technol, Dept Comp Sci, N-2815 Gjovik, Norway
[5] Nanjing Univ Posts & Telecommun, Natl Engn Res Ctr Commun & Network Technol, Nanjing 210003, Peoples R China
Funding:
National Natural Science Foundation of China
Keywords:
Computational complexity; Data fusion; Computational efficiency
DOI:
10.1155/2020/8843186
CLC Classification:
TP [Automation Technology, Computer Technology]
Discipline Code:
0812
Abstract:
Multimodal fusion is a popular direction of multimodal research and an emerging research field of artificial intelligence. It aims to exploit the complementarity of heterogeneous data and provide reliable classification for the model. Multimodal data fusion transforms data from multiple single-modality representations into a compact multimodal representation. Most previous studies in this field have used tensor-based multimodal representations; as the input is converted into a tensor, the dimensionality and computational complexity grow exponentially. In this paper, we propose a low-rank tensor multimodal fusion method with an attention mechanism, which improves efficiency and reduces computational complexity. We evaluate our model on three multimodal fusion tasks based on public datasets: CMU-MOSI, IEMOCAP, and POM. Our model achieves good performance while flexibly capturing both global and local connections. Experiments show that, compared with other tensor-based multimodal fusion methods, our model consistently achieves better results under a range of attention mechanisms.
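To make the idea in the abstract concrete, the following is a minimal sketch (not the authors' released implementation) of low-rank tensor fusion combined with a simple attention weighting over modalities, written in PyTorch. The class name AttentiveLowRankFusion, the feature dimensions (300/74/35, loosely modeled on common text/audio/video features for CMU-MOSI), the rank, and the output size are all illustrative assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentiveLowRankFusion(nn.Module):
    """Low-rank tensor fusion with softmax attention over modalities (illustrative sketch)."""
    def __init__(self, dims, out_dim, rank=4):
        super().__init__()
        self.rank = rank
        # One low-rank factor per modality; the +1 appends a constant bias feature.
        self.factors = nn.ParameterList(
            [nn.Parameter(torch.randn(rank, d + 1, out_dim) * 0.1) for d in dims])
        self.fusion_weights = nn.Parameter(torch.randn(1, rank) * 0.1)
        self.fusion_bias = nn.Parameter(torch.zeros(1, out_dim))
        # One scalar attention score per modality, computed from that modality's own features.
        self.attn = nn.ModuleList([nn.Linear(d, 1) for d in dims])

    def forward(self, inputs):
        # inputs: list of tensors, one per modality, each of shape (batch, dim_m)
        scores = torch.cat([a(x) for a, x in zip(self.attn, inputs)], dim=1)
        alpha = F.softmax(scores, dim=1)              # (batch, num_modalities)
        fused = None
        for m, (x, W) in enumerate(zip(inputs, self.factors)):
            ones = torch.ones(x.size(0), 1, device=x.device)
            x1 = torch.cat([x, ones], dim=1)          # (batch, dim_m + 1)
            proj = torch.einsum('bd,rdo->bro', x1, W) # (batch, rank, out_dim)
            proj = proj * alpha[:, m].view(-1, 1, 1)  # attention-weighted modality
            # Elementwise product across modalities realizes the low-rank tensor product.
            fused = proj if fused is None else fused * proj
        weights = self.fusion_weights.expand(fused.size(0), -1)
        return torch.einsum('br,bro->bo', weights, fused) + self.fusion_bias

# Toy usage with hypothetical text/audio/video feature sizes.
model = AttentiveLowRankFusion(dims=[300, 74, 35], out_dim=64, rank=4)
text, audio, video = torch.randn(8, 300), torch.randn(8, 74), torch.randn(8, 35)
print(model([text, audio, video]).shape)  # torch.Size([8, 64])

The design point illustrated here is the one argued in the abstract: each modality is projected through rank-many low-rank factors and combined by an elementwise product, so the fused representation is obtained without materializing the full outer-product tensor, while the softmax scores reweight modalities before fusion.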
Pages: 8
Related Papers (50 in total):
  • [31] Wang, Yongjie; Wang, Feng; Huang, Dongyang. Dual-branch counting method for dense crowd based on self-attention mechanism. EXPERT SYSTEMS WITH APPLICATIONS, 2024, 236.
  • [32] Hu, Meng; Bai, Lu; Yang, Mei. Script event prediction method based on self-attention mechanism and graph representation learning. 2022 IEEE 6TH ADVANCED INFORMATION TECHNOLOGY, ELECTRONIC AND AUTOMATION CONTROL CONFERENCE (IAEAC), 2022: 722-726.
  • [33] Han Li; Duan Qianqian. Solving method of traveling salesman problem based on performer graph self-attention mechanism. Signal, Image and Video Processing, 2025, 19 (2).
  • [34] Fang, Chunxin; Ma, Hui; Li, Jianian. A finger vein authentication method based on the lightweight Siamese network with the self-attention mechanism. INFRARED PHYSICS & TECHNOLOGY, 2023, 128.
  • [35] Cai, Xingquan; Xi, Mengyao; Yu, Nu; Yang, Zhe; Sun, Haiyan. A Terrain Elevation Map Generation Method Based on Self-Attention Mechanism and Multifeature Sketch. COMPUTATIONAL INTELLIGENCE AND NEUROSCIENCE, 2022, 2022.
  • [36] Guo, Xiaojun; Gao, Xuan. A SYN Flood Attack Detection Method Based on Hierarchical Multihead Self-Attention Mechanism. SECURITY AND COMMUNICATION NETWORKS, 2022, 2022.
  • [37] Yu, Guoyan; Luo, Yingtong; Deng, Ruoling. Automatic segmentation of golden pomfret based on fusion of multi-head self-attention and channel-attention mechanism. COMPUTERS AND ELECTRONICS IN AGRICULTURE, 2022, 202.
  • [38] Wu, K.; Ding, Y. Wide Self-attention Mechanism Fusion Dense Residual Network Image Dehazing. Hunan Daxue Xuebao/Journal of Hunan University Natural Sciences, 2023, 50 (08): 13-22.
  • [39] Wang, Danxu; Wei, Yanhui; Liu, Junnan; Ouyang, Wenjia; Zhou, Xilin. Underwater image imbalance attenuation compensation based on attention and self-attention mechanism. 2022 OCEANS HAMPTON ROADS, 2022.
  • [40] Li, Yujie; Cai, Jintong. Point cloud classification network based on self-attention mechanism. COMPUTERS & ELECTRICAL ENGINEERING, 2022, 104.