Ensemble deep learning enhanced with self-attention for predicting immunotherapeutic responses to cancers

Cited: 44
Authors
Jin, Wenyi [1 ]
Yang, Qian [2 ]
Chi, Hao [3 ]
Wei, Kongyuan [4 ]
Zhang, Pengpeng [5 ]
Zhao, Guodong [6 ]
Chen, Shi [2 ]
Xia, Zhijia [7 ]
Li, Xiaosong [2 ]
Affiliations
[1] Wuhan Univ, Renmin Hosp, Dept Orthoped, Wuhan, Peoples R China
[2] Chongqing Med Univ, Affiliated Hosp 1, Clin Mol Med Testing Ctr, Chongqing, Peoples R China
[3] Southwest Med Univ, Clin Med Coll, Luzhou, Peoples R China
[4] Heidelberg Univ, Dept Gen Visceral & Transplantat Surg, Heidelberg, Germany
[5] Nanjing Med Univ, Affiliated Hosp 1, Dept Thorac Surg, Nanjing, Peoples R China
[6] Chinese Peoples Liberat Army PLA Gen Hosp, Med Ctr 1, Fac Hepatopancreatobiliary Surg, Beijing, Peoples R China
[7] Ludwig Maximilians Univ Munchen, Dept Gen Visceral & Transplant Surg, Munich, Germany
Source
FRONTIERS IN IMMUNOLOGY, 2022, Vol. 13
Funding
National Natural Science Foundation of China
Keywords
deep learning; immunotherapy; cancer; PD1; PD-L1; ELISE; MODELS
DOI
10.3389/fimmu.2022.1025330
Chinese Library Classification
R392 [Medical Immunology]; Q939.91 [Immunology]
Discipline Code
100102
Abstract
Introduction: Despite the many benefits immunotherapy has brought to patients with different cancers, its clinical application and improvement are still hindered by drug resistance. Developing a reliable approach to identifying patients who are sensitive to particular immunotherapeutic agents is therefore of great clinical relevance.
Methods: We propose ELISE (Ensemble Learning for Immunotherapeutic Response Evaluation), a pipeline that generates robust, highly accurate models for predicting individual responses to immunotherapies. ELISE employed iterative univariable logistic regression to select patients' genetic features and Monte Carlo Tree Search (MCTS) to tune hyperparameters. In each trial, ELISE selected multiple models for integration based on add or concatenate stacking strategies, including a deep neural network, automatic feature interaction learning via self-attentive neural networks, a deep factorization machine, a compressed interaction network, and a linear neural network, and then adopted the best trial to generate the final model. The SHapley Additive exPlanations (SHAP) algorithm was applied to interpret ELISE, which was then validated in an independent test set.
Results: In predicting responses to atezolizumab among esophageal adenocarcinoma (EAC) patients, ELISE demonstrated superior accuracy (area under the curve [AUC] = 100.00%). AC005786.3 (mean |SHAP value| = 0.0097) was the most valuable contributor to the ELISE output, followed by SNORD3D (0.0092), RN7SKP72 (0.0081), EREG (0.0069), IGHV4-80 (0.0063), and MIR4526 (0.0063). Mechanistically, immunoglobulin complex, immunoglobulin production, adaptive immune response, antigen binding, and related programs were downregulated in the ELISE-neg EAC subtype and resulted in unfavorable responses. More encouragingly, ELISE could be extended to accurately estimate responsiveness to other immunotherapeutic agents in other cancers, including PD1/PD-L1 inhibitors in metastatic urothelial cancer (AUC = 88.86%) and MAGE-A3 immunotherapy in metastatic melanoma (AUC = 100.00%).
Discussion: This study presents deep insights into integrating ensemble deep learning with self-attention to predict immunotherapy responses in human cancers, highlighting ELISE as a potential tool for generating reliable approaches to individualized treatment.
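The Methods description above outlines the stacking architecture in prose only. The following minimal PyTorch sketch is not the authors' released code; the class name EnsembleWithSelfAttention, the layer sizes, and the per-gene token embedding are illustrative assumptions. It shows how a feed-forward branch and a self-attention branch over gene-expression features can be combined under the concatenate stacking strategy into a single responder-probability predictor.

```python
import torch
import torch.nn as nn


class EnsembleWithSelfAttention(nn.Module):
    """Two-branch binary classifier over per-patient gene-expression vectors (hypothetical sketch)."""

    def __init__(self, n_genes: int, embed_dim: int = 32, n_heads: int = 4):
        super().__init__()
        # Plain feed-forward (DNN) branch over the raw expression vector.
        self.dnn = nn.Sequential(
            nn.Linear(n_genes, 128), nn.ReLU(),
            nn.Linear(128, 64), nn.ReLU(),
        )
        # Self-attention branch: each gene becomes one token whose learned
        # embedding is scaled by its expression value, so attention can model
        # pairwise gene-gene interactions.
        self.gene_embed = nn.Parameter(torch.randn(n_genes, embed_dim) * 0.02)
        self.attn = nn.MultiheadAttention(embed_dim, n_heads, batch_first=True)
        # Head over the concatenated branch outputs -> responder probability.
        self.head = nn.Linear(64 + embed_dim, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_genes) expression matrix
        dnn_out = self.dnn(x)                            # (batch, 64)
        tokens = x.unsqueeze(-1) * self.gene_embed       # (batch, n_genes, embed_dim)
        attn_out, _ = self.attn(tokens, tokens, tokens)  # (batch, n_genes, embed_dim)
        pooled = attn_out.mean(dim=1)                    # (batch, embed_dim)
        logits = self.head(torch.cat([dnn_out, pooled], dim=-1))
        return torch.sigmoid(logits).squeeze(-1)         # responder probability


# Toy usage (sizes are placeholders, not taken from the paper):
model = EnsembleWithSelfAttention(n_genes=200)
probs = model(torch.randn(8, 200))  # (8,) predicted response probabilities
```

In a design of this kind, the add stacking strategy would instead sum branch outputs of equal width before the classification head, whereas concatenation, as sketched here, lets the head weight the two branches independently.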
Pages: 13