Siamese Recurrent Neural Network with a Self-Attention Mechanism for Bioactivity Prediction

Cited by: 20
Authors
Fernandez-Llaneza, Daniel [1 ]
Ulander, Silas [1 ]
Gogishvili, Dea [1 ]
Nittinger, Eva [1 ]
Zhao, Hongtao [1 ]
Tyrchan, Christian [1 ]
Affiliations
[1] AstraZeneca, Dept Med Chem Res & Early Dev, Resp & Immunol, Biopharmaceut R&D, SE-43183 Molndal, Sweden
Source
ACS OMEGA | 2021, Vol. 6, Issue 16
Keywords
DRUG DISCOVERY;
D O I
10.1021/acsomega.1c01266
Chinese Library Classification (CLC)
O6 [Chemistry]
Discipline Code
0703
Abstract
Activity prediction plays an essential role in drug discovery by directing the search for drug candidates in the relevant chemical space. Despite being applied successfully to image recognition and semantic similarity, the Siamese neural network has rarely been explored in drug discovery, where modelling faces challenges such as insufficient data and class imbalance. Here, we present a Siamese recurrent neural network model (SiameseCHEM) based on a bidirectional long short-term memory architecture with a self-attention mechanism, which can automatically learn discriminative features from the SMILES representations of small molecules. Subsequently, it is used to categorize the bioactivity of small molecules via N-shot learning. Trained on random SMILES strings, it proves robust across five different datasets for the task of binary or categorical classification of bioactivity. Benchmarked against two baseline machine learning models that use the chemistry-rich ECFP fingerprints as input, the deep learning model outperforms both on three datasets and achieves comparable performance on the other two. The failure of both baseline methods on SMILES strings highlights that the deep learning model may learn task-specific chemistry features encoded in SMILES strings.
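The two core ideas the abstract names can be illustrated with a minimal, self-contained sketch: single-head self-attention pooling over a sequence of encoder hidden states, and a Siamese comparison of the two pooled embeddings. This is only illustrative: the actual SiameseCHEM model uses trained bidirectional LSTM encoders over SMILES tokens, whereas the hidden-state vectors, query vector, and exponentiated-Manhattan similarity head below are hypothetical stand-ins, not the paper's exact formulation.

```python
# Illustrative sketch (pure Python) of self-attention pooling and a
# Siamese similarity comparison. The "hidden states" here are toy
# vectors standing in for BiLSTM outputs over SMILES tokens; the real
# model's dimensions, parameters, and head may differ.
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention_pool(hidden_states, query):
    # Score each hidden state against a (hypothetical) learned query
    # vector, normalize the scores, and take the weighted sum: a
    # single-head self-attention pooling of the sequence.
    scores = [sum(h_i * q_i for h_i, q_i in zip(h, query)) for h in hidden_states]
    weights = softmax(scores)
    dim = len(hidden_states[0])
    return [sum(w * h[d] for w, h in zip(weights, hidden_states)) for d in range(dim)]

def manhattan_similarity(a, b):
    # One common Siamese head: exp(-L1 distance) between the two
    # pooled embeddings, giving a similarity in (0, 1].
    return math.exp(-sum(abs(x - y) for x, y in zip(a, b)))

# Toy hidden-state sequences for two molecules (stand-ins for encoder outputs).
h1 = [[0.1, 0.3], [0.4, 0.2], [0.0, 0.5]]
h2 = [[0.1, 0.3], [0.4, 0.2], [0.1, 0.5]]
query = [1.0, 0.5]

e1 = attention_pool(h1, query)
e2 = attention_pool(h2, query)
sim = manhattan_similarity(e1, e2)  # near 1.0 for near-identical sequences
print(round(sim, 3))
```

The pooled embedding replaces naive last-hidden-state readout, letting the model weight the SMILES tokens that matter most before the two branches are compared.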
Pages: 11086-11094
Page count: 9
Related Papers
50 items in total
  • [21] Mandarin Recognition Based on Self-Attention Mechanism with Deep Convolutional Neural Network (DCNN)-Gated Recurrent Unit (GRU)
    Chen, Xun
    Wang, Chengqi
    Hu, Chao
    Wang, Qin
    Big Data and Cognitive Computing, 2024, 8 (12)
  • [22] A fine-grained classification method based on self-attention Siamese network
    He Can
    Yuan Guowu
    Wu Hao
    THE 5TH INTERNATIONAL CONFERENCE ON VIDEO AND IMAGE PROCESSING, ICVIP 2021, 2021: 148 - 154
  • [23] Self-attention Based Collaborative Neural Network for Recommendation
    Ma, Shengchao
    Zhu, Jinghua
    WIRELESS ALGORITHMS, SYSTEMS, AND APPLICATIONS, WASA 2019, 2019, 11604 : 235 - 246
  • [24] A new deep self-attention neural network for GNSS coordinate time series prediction
    Jiang, Weiping
    Wang, Jian
    Li, Zhao
    Li, Wudong
    Yuan, Peng
    GPS SOLUTIONS, 2024, 28 (01)
  • [25] Self-Attention Enhanced Recurrent Neural Networks for Sentence Classification
    Kumar, Ankit
    Rastogi, Reshma
    2018 IEEE SYMPOSIUM SERIES ON COMPUTATIONAL INTELLIGENCE (IEEE SSCI), 2018, : 905 - 911
  • [26] A pagerank self-attention network for traffic flow prediction
    Kang, Ting
    Wang, Huaizhi
    Wu, Ting
    Peng, Jianchun
    Jiang, Hui
    FRONTIERS IN ENERGY RESEARCH, 2022, 10
  • [27] Multimodal Modules and Self-Attention for Graph Neural Network Molecular Properties Prediction Model
    Punnachaiya, Kamol
    Vateekul, Peerapon
    Wichadakul, Duangdao
    2023 11TH INTERNATIONAL CONFERENCE ON BIOINFORMATICS AND COMPUTATIONAL BIOLOGY, ICBCB, 2023, : 141 - 146
  • [29] Graph neural network with self-attention for material discovery
    Chen, Xuesi
    Jiang, Hantong
    Lin, Xuanjie
    Ren, Yongsheng
    Wu, Congzhong
    Zhan, Shu
    Ma, Wenhui
    MOLECULAR PHYSICS, 2023, 121 (04)
  • [30] Micro LED defect detection with self-attention mechanism-based neural network
    Zhong, Zebang
    Li, Cheng
    Chen, Meiyun
    Wu, Heng
    Kiyoshi, Takamasu
    DIGITAL SIGNAL PROCESSING, 2024, 149