Contrastive learning of protein representations with graph neural networks for structural and functional annotations

Cited by: 0
Authors
Luo, Jiaqi [1 ]
Luo, Yunan [2 ]
Affiliations
[1] Tsinghua Univ, Inst Interdisciplinary Informat Sci, Beijing, Peoples R China
[2] Georgia Inst Technol, Sch Computat Sci & Engn, Atlanta, GA 30332 USA
Keywords
Protein annotation; Protein structure and function; Deep learning; Graph neural network; Contrastive learning; Representation learning; Sequence
DOI
Not available
Chinese Library Classification (CLC)
R318 [Biomedical Engineering]
Discipline Code
0831
Abstract
Although protein sequence data is growing at an ever-increasing rate, the protein universe remains sparsely annotated with functional and structural labels. Computational approaches have become efficient means of inferring annotations for unlabeled proteins by transferring knowledge from proteins with experimental annotations. Despite the increasing availability of protein structure data and the high coverage of high-quality predicted structures, e.g., from AlphaFold, many existing computational tools still rely only on sequence data to predict structural or functional annotations, including alignment algorithms such as BLAST and several sequence-based deep learning models. Here, we develop PenLight, a general deep learning framework for protein structural and functional annotation. PenLight uses a graph neural network (GNN) to integrate 3D protein structure data with protein language model representations. In addition, PenLight applies a contrastive learning strategy to train the GNN so that the learned protein representations reflect similarities beyond sequence identity, such as semantic similarities in function or structure space. We benchmarked PenLight on a structural classification task and a functional annotation task, where it achieved higher prediction accuracy and coverage than state-of-the-art methods.
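The abstract describes a recipe of three parts: a GNN encoder over protein structure graphs, node features taken from a protein language model, and a contrastive objective that pulls proteins with shared annotations together in embedding space. The sketch below is a minimal, hypothetical illustration of that recipe and is not PenLight's actual implementation: the two-layer graph convolution, the distance-threshold contact graph, the 1280-dimensional placeholder features (ESM-style), and the supervised contrastive loss are all assumptions made here for demonstration only.

```python
# Minimal sketch (assumptions, not PenLight): residues are graph nodes, node
# features stand in for protein language model embeddings, edges come from a
# contact map, and a supervised contrastive loss treats same-label proteins
# as positives.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SimpleGraphEncoder(nn.Module):
    """Two-layer graph convolution H' = ReLU(A_norm @ H @ W), mean-pooled per protein."""

    def __init__(self, in_dim, hid_dim, out_dim):
        super().__init__()
        self.lin1 = nn.Linear(in_dim, hid_dim)
        self.lin2 = nn.Linear(hid_dim, out_dim)

    def forward(self, x, adj):
        # x: [n_residues, in_dim] node features; adj: normalized [n, n] contact graph
        h = F.relu(adj @ self.lin1(x))
        h = adj @ self.lin2(h)
        return h.mean(dim=0)  # residue-level -> protein-level embedding


def normalize_adj(adj):
    """Symmetric normalization D^-1/2 (A + I) D^-1/2 of a dense 0/1 adjacency."""
    adj = adj + torch.eye(adj.size(0))
    deg = adj.sum(dim=1)
    d = deg.pow(-0.5)
    return d.unsqueeze(1) * adj * d.unsqueeze(0)


def supcon_loss(z, labels, tau=0.1):
    """Supervised contrastive loss: proteins sharing a label are positives for each other."""
    z = F.normalize(z, dim=1)
    sim = z @ z.t() / tau
    n = z.size(0)
    self_mask = torch.eye(n, dtype=torch.bool)
    pos_mask = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~self_mask
    logits = sim - sim.max(dim=1, keepdim=True).values.detach()
    exp = torch.exp(logits) * (~self_mask).float()
    log_prob = logits - torch.log(exp.sum(dim=1, keepdim=True) + 1e-9)
    pos_counts = pos_mask.sum(dim=1).clamp(min=1)
    return -(log_prob * pos_mask.float()).sum(dim=1).div(pos_counts).mean()


# Toy usage with random placeholder data; real inputs would be language model
# embeddings and structure-derived contact maps, with labels such as fold
# classes or functional categories.
torch.manual_seed(0)
encoder = SimpleGraphEncoder(in_dim=1280, hid_dim=256, out_dim=128)
proteins = []
for n_res in (50, 60, 55, 45):
    x = torch.randn(n_res, 1280)
    contacts = (torch.rand(n_res, n_res) > 0.9).float()
    adj = normalize_adj(((contacts + contacts.t()) > 0).float())
    proteins.append((x, adj))
labels = torch.tensor([0, 0, 1, 1])
embeddings = torch.stack([encoder(x, adj) for x, adj in proteins])
loss = supcon_loss(embeddings, labels)
loss.backward()
```

In the setting the abstract describes, the graph would be built from 3D structure (including predicted models), and the positive pairs for the contrastive loss would be defined by structural or functional similarity rather than by sequence identity alone.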
Pages: 109-120
Page count: 12