Signal Peptides Generated by Attention-Based Neural Networks

Cited by: 53
Authors
Wu, Zachary [1 ]
Yang, Kevin K. [1 ]
Liszka, Michael J. [2 ]
Lee, Alycia [3 ]
Batzilla, Alina [2 ]
Wernick, David [2 ]
Weiner, David P. [2 ]
Arnold, Frances H. [1 ]
Affiliations
[1] CALTECH, Dept Chem & Chem Engn, Pasadena, CA 91125 USA
[2] BASF Enzymes, San Diego, CA 92121 USA
[3] CALTECH, Dept Computat & Math Sci, Pasadena, CA 91125 USA
Source
ACS SYNTHETIC BIOLOGY | 2020, Vol. 9, Issue 8
Funding
U.S. National Science Foundation
Keywords
machine learning; signal peptides; protein design; Bacillus subtilis; secretion; RECOMBINANT PROTEIN SECRETION; BACILLUS-SUBTILIS; TRANSLOCATION; OPTIMIZATION; TOOL;
DOI
10.1021/acssynbio.0c00219
Chinese Library Classification
Q5 [Biochemistry]
Discipline codes
071010; 081704
Abstract
Short (15-30 residue) chains of amino acids at the amino termini of expressed proteins, known as signal peptides (SPs), specify secretion in living cells. We trained an attention-based neural network, the Transformer model, on data from all available organisms in Swiss-Prot to generate SP sequences. Experimental testing demonstrates that the model-generated SPs are functional: when appended to enzymes expressed in an industrial Bacillus subtilis strain, the SPs lead to secreted activity that is competitive with industrially used SPs. Additionally, the model-generated SPs are diverse in sequence, sharing as little as 58% sequence identity with the closest known native signal peptide, and 73% ± 9% on average.
Pages: 2154-2161
Page count: 8
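The diversity claim in the abstract is a nearest-neighbor sequence-identity measurement: each generated SP is compared against all known native SPs and scored by its best match. Below is a minimal sketch (not the authors' code) of that computation, using a basic Needleman-Wunsch global alignment with unit match/mismatch/gap scores and identity normalized by alignment length; the paper's exact alignment scoring is not stated in the abstract, and the example sequences are hypothetical.

```python
# Sketch of the diversity metric: percent identity of each generated signal
# peptide (SP) to its closest native SP. Alignment scoring here is an
# assumption (match +1, mismatch -1, gap -1), not taken from the paper.

def global_align_identity(a: str, b: str) -> float:
    """Percent identity under a simple global alignment, normalized by alignment length."""
    n, m = len(a), len(b)
    # Fill the dynamic-programming score matrix.
    score = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        score[i][0] = -i
    for j in range(1, m + 1):
        score[0][j] = -j
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            diag = score[i - 1][j - 1] + (1 if a[i - 1] == b[j - 1] else -1)
            score[i][j] = max(diag, score[i - 1][j] - 1, score[i][j - 1] - 1)
    # Trace back to count identical positions and the alignment length.
    i, j, matches, length = n, m, 0, 0
    while i > 0 and j > 0:
        diag = score[i - 1][j - 1] + (1 if a[i - 1] == b[j - 1] else -1)
        if score[i][j] == diag:
            matches += a[i - 1] == b[j - 1]
            i, j = i - 1, j - 1
        elif score[i][j] == score[i - 1][j] - 1:
            i -= 1
        else:
            j -= 1
        length += 1
    length += i + j  # remaining leading gaps
    return 100.0 * matches / length


def max_identity_to_native(generated: str, native_sps: list) -> float:
    """Highest percent identity of one generated SP to any native SP."""
    return max(global_align_identity(generated, sp) for sp in native_sps)


# Hypothetical example sequences (illustrative only, not from the paper).
native = ["MKKRLLALALLLAVSAAQA", "MKQQKRLYARLLTLLFALIFLLPHSSASA"]
generated = ["MKKLLVLSLLAFTASA"]
for g in generated:
    print(f"{g}: closest native identity = {max_identity_to_native(g, native):.1f}%")
```

With the full set of native SPs from Swiss-Prot in place of the two placeholder sequences, the minimum and mean of these per-sequence best-match identities correspond to the 58% and 73% ± 9% figures reported in the abstract.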