Enhancing Multimodal Patterns in Neuroimaging by Siamese Neural Networks with Self-Attention Mechanism

Cited by: 13

Authors
Arco, Juan E. [1 ,2 ,3 ]
Ortiz, Andres [2 ,3 ]
Gallego-Molina, Nicolas J. [2 ,3 ]
Gorriz, Juan M. [1 ,3 ]
Ramirez, Javier [1 ,3 ]
Affiliations
[1] Univ Granada, Dept Signal Theory Networking & Commun, Granada 18010, Spain
[2] Univ Malaga, Dept Commun Engn, Malaga 29010, Spain
[3] Andalusian Res Inst Data Sci & Computat Intellige, Granada, Spain
Keywords
Multimodal combination; siamese neural network; self-attention; deep learning; medical imaging; ALZHEIMERS-DISEASE; FUNCTIONAL CONNECTIVITY; MATTER LOSS; DIAGNOSIS; FUSION; MULTISCALE; MRI;
DOI
10.1142/S0129065723500193
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The combination of different sources of information is currently one of the most relevant aspects of the diagnostic process for several diseases. In the field of neurological disorders, different imaging modalities providing structural and functional information are frequently available. These modalities are usually analyzed separately, although a joint analysis of the features extracted from both sources can improve the classification performance of Computer-Aided Diagnosis (CAD) tools. Previous studies have computed independent models from each individual modality and combined them in a subsequent stage, which is not an optimal solution. In this work, we propose a method based on the principles of siamese neural networks to fuse information from Magnetic Resonance Imaging (MRI) and Positron Emission Tomography (PET). This framework quantifies the similarities between the two modalities and relates them to the diagnostic label during the training process. The resulting latent space at the output of this network is then fed into an attention module in order to evaluate the relevance of each brain region at different stages of the development of Alzheimer's disease. The excellent results obtained and the high flexibility of the proposed method allow more than two modalities to be fused, leading to a scalable methodology that can be used in a wide range of contexts.
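The architecture the abstract outlines — a shared-weight (siamese) encoder applied to each modality, a cross-modal similarity measure, and a self-attention module over the fused latent space — can be sketched in miniature as follows. This is an illustrative NumPy sketch only, not the authors' implementation: the single-layer encoder, the region count (90, as in a typical atlas parcellation), the feature and latent dimensions, and the random inputs are all assumptions made for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(x, W):
    """Shared-weight encoder branch: the SAME W is applied to both
    modalities, which is the defining property of a siamese network."""
    return np.maximum(x @ W, 0.0)  # linear layer + ReLU

def self_attention(Z):
    """Scaled dot-product self-attention over brain regions (rows of Z)."""
    d = Z.shape[-1]
    scores = Z @ Z.T / np.sqrt(d)                  # region-to-region affinities
    scores -= scores.max(axis=-1, keepdims=True)   # numerical stability
    A = np.exp(scores)
    A /= A.sum(axis=-1, keepdims=True)             # softmax over regions
    return A @ Z, A                                # re-weighted features, attention map

# Hypothetical dimensions: 90 brain regions, 16 features each, 8-dim latent space
n_regions, n_feat, d_latent = 90, 16, 8
W = rng.normal(size=(n_feat, d_latent)) / np.sqrt(n_feat)

mri = rng.normal(size=(n_regions, n_feat))  # structural features per region
pet = rng.normal(size=(n_regions, n_feat))  # functional features per region

z_mri = encode(mri, W)  # siamese branches: identical weights W
z_pet = encode(pet, W)

# Per-region cosine similarity quantifies cross-modal agreement; during
# training this would be related to the diagnostic label via the loss.
num = (z_mri * z_pet).sum(axis=1)
den = np.linalg.norm(z_mri, axis=1) * np.linalg.norm(z_pet, axis=1) + 1e-8
sim = num / den

# The fused latent space enters the attention module; each row of A scores
# how much every other region contributes to a given region's representation.
fused, A = self_attention(np.concatenate([z_mri, z_pet], axis=1))
```

Reading out the rows (or column sums) of `A` is one simple way to rank region relevance, which corresponds to the role the abstract assigns to the attention module.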
Pages: 18