Enhancing Multimodal Patterns in Neuroimaging by Siamese Neural Networks with Self-Attention Mechanism

Cited by: 13
Authors
Arco, Juan E. [1 ,2 ,3 ]
Ortiz, Andres [2 ,3 ]
Gallego-Molina, Nicolas J. [2 ,3 ]
Gorriz, Juan M. [1 ,3 ]
Ramirez, Javier [1 ,3 ]
Affiliations
[1] Univ Granada, Dept Signal Theory Networking & Commun, Granada 18010, Spain
[2] Univ Malaga, Dept Commun Engn, Malaga 29010, Spain
[3] Andalusian Res Inst Data Sci & Computat Intellige, Granada, Spain
Keywords
Multimodal combination; siamese neural network; self-attention; deep learning; medical imaging; ALZHEIMERS-DISEASE; FUNCTIONAL CONNECTIVITY; MATTER LOSS; DIAGNOSIS; FUSION; MULTISCALE; MRI;
DOI
10.1142/S0129065723500193
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The combination of different sources of information is currently one of the most relevant aspects in the diagnostic process of several diseases. In the field of neurological disorders, different imaging modalities providing structural and functional information are frequently available. These modalities are usually analyzed separately, although a joint analysis of the features extracted from both sources can improve the classification performance of Computer-Aided Diagnosis (CAD) tools. Previous studies have computed independent models from each individual modality and combined them in a subsequent stage, which is not an optimal solution. In this work, we propose a method based on the principles of siamese neural networks to fuse information from Magnetic Resonance Imaging (MRI) and Positron Emission Tomography (PET). This framework quantifies the similarities between both modalities and relates them to the diagnostic label during the training process. The resulting latent space at the output of this network is then fed into an attention module in order to evaluate the relevance of each brain region at different stages of the development of Alzheimer's disease. The excellent results obtained and the high flexibility of the proposed method make it possible to fuse more than two modalities, leading to a scalable methodology that can be used in a wide range of contexts.
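The two ingredients the abstract describes — a shared-weight (siamese) encoder whose latent distance is tied to the diagnostic label, followed by self-attention over region embeddings — can be sketched in a few lines. This is a minimal NumPy illustration, not the authors' implementation: the single-layer encoder, the contrastive loss, and all dimensions (4 brain regions, 8 features per modality) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(x, W):
    """One siamese branch: a single shared linear layer + ReLU (sketch only)."""
    return np.maximum(x @ W, 0.0)

def contrastive_loss(z1, z2, same_label, margin=1.0):
    """Standard contrastive loss: pull latents together when the pair shares
    a diagnostic label, push them apart (up to `margin`) otherwise."""
    d = np.linalg.norm(z1 - z2)
    return d ** 2 if same_label else max(margin - d, 0.0) ** 2

def self_attention(Z):
    """Scaled dot-product self-attention over brain-region embeddings."""
    d_k = Z.shape[-1]
    scores = Z @ Z.T / np.sqrt(d_k)
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    A = np.exp(scores)
    A /= A.sum(axis=-1, keepdims=True)            # rows sum to 1
    return A @ Z, A

# Toy data: 4 brain regions x 8 features for each modality (hypothetical sizes).
mri = rng.normal(size=(4, 8))
pet = rng.normal(size=(4, 8))
W = rng.normal(size=(8, 8)) * 0.1                 # weights shared by both branches

z_mri, z_pet = encode(mri, W), encode(pet, W)
loss = contrastive_loss(z_mri.ravel(), z_pet.ravel(), same_label=True)

# Fuse the two latents per region, then let attention weight the regions.
fused, attn = self_attention(np.concatenate([z_mri, z_pet], axis=1))
print(fused.shape, attn.shape)
```

The attention matrix `attn` is what gives per-region relevance: row i tells how much each region contributes to region i's fused representation, which is the kind of signal the paper uses to rank brain regions across disease stages.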
Pages: 18