Discovery of regulatory motifs in 5' untranslated regions using interpretable multi-task learning models

Cited by: 5
Authors
Zheng, Weizhong [1 ]
Fong, John H. C. [1 ]
Wan, Yuk Kei [1 ]
Chu, Athena H. Y. [1 ,2 ]
Huang, Yuanhua [1 ,3 ,4 ]
Wong, Alan S. L. [1 ,2 ,5 ]
Ho, Joshua W. K. [1 ,6 ]
Affiliations
[1] Univ Hong Kong, Li Ka Shing Fac Med, Sch Biomed Sci, Hong Kong, Peoples R China
[2] Ctr Oncol & Immunol, Hong Kong Sci Pk, Hong Kong, Peoples R China
[3] Univ Hong Kong, Dept Stat & Actuarial Sci, Hong Kong, Peoples R China
[4] Ctr Translat Stem Cell Biol, Hong Kong Sci & Technol Pk, Hong Kong, Peoples R China
[5] Univ Hong Kong, Dept Elect & Elect Engn, Hong Kong, Peoples R China
[6] Lab Data Discovery Hlth D24H Ltd, Hong Kong Sci Pk, Hong Kong, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
TRANSLATION INITIATION; PROTEIN; SEQUENCES;
DOI
10.1016/j.cels.2023.10.011
Chinese Library Classification (CLC)
Q5 [Biochemistry]; Q7 [Molecular Biology];
Subject classification codes
071010; 081704;
Abstract
The sequence in the 5' untranslated regions (UTRs) is known to affect mRNA translation rates. However, the underlying regulatory grammar remains elusive. Here, we propose MTtrans, a multi-task translation rate predictor capable of learning common sequence patterns from datasets across various experimental techniques. The core premise is that common motifs are more likely to be genuinely involved in translation control. MTtrans outperforms existing methods in both accuracy and the ability to capture transferable motifs across species, highlighting its strength in identifying evolutionarily conserved sequence motifs. Our independent fluorescence-activated cell sorting coupled with deep sequencing (FACS-seq) experiment validates the impact of most motifs identified by MTtrans. Additionally, we introduce "GRU-rewiring," a technique for interpreting the hidden states of the recurrent units. Gated recurrent unit (GRU)-rewiring allows us to identify regulatory element-enriched positions and examine the local effects of 5' UTR mutations. MTtrans is a powerful tool for deciphering translation regulatory motifs.
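To make the multi-task design described in the abstract concrete, the sketch below shows one plausible way to wire a shared sequence encoder to several task-specific regression heads, one per experimental dataset, so that motif detectors are learned jointly across datasets. This is a minimal PyTorch illustration, not the published MTtrans implementation: the Conv1d-plus-GRU encoder, all layer sizes, and the task names (e.g., "mpra_human") are assumptions for demonstration only.

    # Minimal sketch of a multi-task 5' UTR translation-rate predictor in the
    # spirit of MTtrans: a shared sequence encoder feeds per-dataset regression
    # heads. Illustrative only; architecture details are assumed, not taken
    # from the paper.
    import torch
    import torch.nn as nn

    NUCLEOTIDES = "ACGU"

    def one_hot(seq: str) -> torch.Tensor:
        """One-hot encode a 5' UTR sequence into shape (4, length)."""
        idx = torch.tensor([NUCLEOTIDES.index(b) for b in seq])
        return torch.nn.functional.one_hot(idx, num_classes=4).T.float()

    class SharedEncoder(nn.Module):
        """Convolution to detect short motifs, then a GRU to model their order."""
        def __init__(self, n_filters: int = 64, hidden: int = 32):
            super().__init__()
            self.conv = nn.Conv1d(4, n_filters, kernel_size=7, padding=3)
            self.gru = nn.GRU(n_filters, hidden, batch_first=True)

        def forward(self, x):                        # x: (batch, 4, length)
            h = torch.relu(self.conv(x))             # (batch, n_filters, length)
            out, last = self.gru(h.transpose(1, 2))  # last: (1, batch, hidden)
            return last.squeeze(0)                   # (batch, hidden)

    class MultiTaskModel(nn.Module):
        """One shared encoder, one linear regression head per dataset/task."""
        def __init__(self, task_names, hidden: int = 32):
            super().__init__()
            self.encoder = SharedEncoder(hidden=hidden)
            self.heads = nn.ModuleDict({t: nn.Linear(hidden, 1) for t in task_names})

        def forward(self, x, task: str):
            return self.heads[task](self.encoder(x)).squeeze(-1)

    if __name__ == "__main__":
        # Hypothetical task names standing in for different experimental datasets.
        model = MultiTaskModel(["mpra_yeast", "mpra_human", "polysome"])
        batch = torch.stack([one_hot("AUGCUGACGUAGCUAGGCA"),
                             one_hot("GGGCUAACGUUAGCUAACA")])
        pred = model(batch, task="mpra_human")  # per-task translation-rate score
        print(pred.shape)                       # torch.Size([2])

The design intuition, following the abstract's premise, is that the shared encoder is where cross-dataset sequence patterns accumulate, while the per-task heads absorb dataset-specific scaling and noise; motifs recovered by the shared layers are therefore more likely to reflect genuine translation control rather than artifacts of any single experimental technique.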
Pages: 1103-1112.e6
Number of pages: 17