Ensemble deep learning enhanced with self-attention for predicting immunotherapeutic responses to cancers

Cited by: 44
Authors
Jin, Wenyi [1 ]
Yang, Qian [2 ]
Chi, Hao [3 ]
Wei, Kongyuan [4 ]
Zhang, Pengpeng [5 ]
Zhao, Guodong [6 ]
Chen, Shi [2 ]
Xia, Zhijia [7 ]
Li, Xiaosong [2 ]
Affiliations
[1] Wuhan Univ, Renmin Hosp, Dept Orthoped, Wuhan, Peoples R China
[2] Chongqing Med Univ, Affiliated Hosp 1, Clin Mol Med Testing Ctr, Chongqing, Peoples R China
[3] Southwest Med Univ, Clin Med Coll, Luzhou, Peoples R China
[4] Heidelberg Univ, Dept Gen Visceral & Transplantat Surg, Heidelberg, Germany
[5] Nanjing Med Univ, Affiliated Hosp 1, Dept Thorac Surg, Nanjing, Peoples R China
[6] Chinese Peoples Liberat Army PLA Gen Hosp, Med Ctr 1, Fac Hepatopancreatobiliary Surg, Beijing, Peoples R China
[7] Ludwig Maximilians Univ Munchen, Dept Gen Visceral & Transplant Surg, Munich, Germany
Source
FRONTIERS IN IMMUNOLOGY | 2022, Vol. 13
Funding
National Natural Science Foundation of China;
Keywords
deep learning; immunotherapy; cancer; PD1; PD-L1; ELISE; MODELS;
DOI
10.3389/fimmu.2022.1025330
Chinese Library Classification (CLC)
R392 [Medical Immunology]; Q939.91 [Immunology];
Discipline Code
100102;
Abstract
Introduction: Despite the many benefits immunotherapy has brought to patients with different cancers, its clinical application and improvement are still hindered by drug resistance. A reliable approach to identifying patients who are sensitive to specific immunotherapeutic agents is therefore of great clinical relevance.
Methods: We propose ELISE (Ensemble Learning for Immunotherapeutic Response Evaluation), a pipeline that generates robust and highly accurate models for predicting individual responses to immunotherapies. ELISE employs iterative univariable logistic regression to select genetic features of patients and Monte Carlo Tree Search (MCTS) to tune hyperparameters. In each trial, ELISE selects multiple models for integration based on add or concatenate stacking strategies, including a deep neural network, automatic feature interaction learning via self-attentive neural networks, a deep factorization machine, a compressed interaction network, and a linear neural network, then adopts the best trial to generate the final model. The SHapley Additive exPlanations (SHAP) algorithm was applied to interpret ELISE, which was then validated in an independent test set.
Results: For prediction of responses to atezolizumab in esophageal adenocarcinoma (EAC) patients, ELISE demonstrated superior accuracy (Area Under the Curve [AUC] = 100.00%). AC005786.3 (mean |SHAP value| = 0.0097) was the most valuable contributor to the ELISE output, followed by SNORD3D (0.0092), RN7SKP72 (0.0081), EREG (0.0069), IGHV4-80 (0.0063), and MIR4526 (0.0063). Mechanistically, immunoglobulin complex, immunoglobulin production, adaptive immune response, antigen binding, and related programs were downregulated in the ELISE-neg EAC subtype, resulting in unfavorable responses. More encouragingly, ELISE could be extended to accurately estimate the responsiveness of various immunotherapeutic agents against other cancers, including PD1/PD-L1 inhibitors against metastatic urothelial cancer (AUC = 88.86%) and MAGE-A3 immunotherapy against metastatic melanoma (AUC = 100.00%).
Discussion: This study presents deep insights into integrating ensemble deep learning with self-attention for predicting immunotherapy responses in human cancers, highlighting ELISE as a potential tool for generating reliable approaches to individualized treatment.
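The Methods describe a stacked ensemble in which a linear branch and a self-attentive feature-interaction branch are combined and the resulting predictions are attributed to gene features with SHAP. The sketch below is only an illustrative reconstruction of that idea, not the authors' ELISE code: the architecture, layer sizes, gene count, random data, and the use of shap.KernelExplainer are assumptions, and the MCTS hyperparameter search plus the other ensemble members (deep factorization machine, compressed interaction network) are omitted.

```python
# Minimal sketch: concatenate-style stacking of a linear branch and a
# self-attention branch, with SHAP feature attribution. Hypothetical names
# and dimensions; not the published ELISE implementation.
import numpy as np
import torch
import torch.nn as nn
import shap


class StackedResponseNet(nn.Module):
    def __init__(self, n_genes, d_model=32, n_heads=4):
        super().__init__()
        self.linear_branch = nn.Linear(n_genes, d_model)      # linear neural network branch
        self.embed = nn.Linear(1, d_model)                     # embed each gene expression value as a token
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)  # self-attention over gene tokens
        self.head = nn.Sequential(nn.Linear(2 * d_model, 16), nn.ReLU(), nn.Linear(16, 1))

    def forward(self, x):                                      # x: (batch, n_genes)
        lin = self.linear_branch(x)                            # (batch, d_model)
        tokens = self.embed(x.unsqueeze(-1))                   # (batch, n_genes, d_model)
        att, _ = self.attn(tokens, tokens, tokens)             # self-attentive feature interactions
        att = att.mean(dim=1)                                  # pool over gene tokens -> (batch, d_model)
        stacked = torch.cat([lin, att], dim=-1)                # "concatenate" stacking of the two branches
        return torch.sigmoid(self.head(stacked))               # predicted response probability


n_genes = 50
X = torch.randn(128, n_genes)                                  # stand-in expression matrix (random data)
model = StackedResponseNet(n_genes).eval()


def predict_fn(x_np):
    """Model-agnostic wrapper so SHAP can query the network on numpy arrays."""
    with torch.no_grad():
        return model(torch.from_numpy(x_np).float()).numpy().ravel()


# Attribute predictions to gene features; mean |SHAP value| per gene mirrors
# the ranking reported in the Results (e.g. AC005786.3, SNORD3D, ...).
explainer = shap.KernelExplainer(predict_fn, X[:20].numpy())
shap_values = explainer.shap_values(X[20:25].numpy(), nsamples=100)
print(np.abs(shap_values).mean(axis=0)[:10])                   # mean |SHAP| for the first 10 genes
```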
Pages: 13