A Transformer-Embedded Multi-Task Model for Dose Distribution Prediction

Cited: 22
Authors
Wen, Lu [1 ]
Xiao, Jianghong [2 ]
Tan, Shuai [1 ]
Wu, Xi [3 ]
Zhou, Jiliu [1 ]
Peng, Xingchen [4 ]
Wang, Yan [1 ]
Affiliations
[1] Sichuan Univ, Sch Comp Sci, Chengdu, Peoples R China
[2] Sichuan Univ, Canc Ctr, Dept Radiat Oncol, West China Hosp, Chengdu, Peoples R China
[3] Chengdu Univ Informat Technol, Sch Comp Sci, Chengdu, Peoples R China
[4] Sichuan Univ, Canc Ctr, Dept Biotherapy, West China Hosp, Chengdu, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Dose prediction; multi-task learning; isodose lines; gradient information; consistency constraint; transformer; INTENSITY-MODULATED RADIOTHERAPY; PLAN QUALITY; NECK-CANCER; NETWORK; HEAD;
DOI
10.1142/S0129065723500430
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
Radiation therapy is a fundamental cancer treatment in the clinic. However, to satisfy clinical requirements, radiologists have to iteratively adjust the radiotherapy plan based on experience, making it extremely subjective and time-consuming to obtain a clinically acceptable plan. To this end, we introduce a transformer-embedded multi-task dose prediction (TransMTDP) network to automatically predict the dose distribution in radiotherapy. Specifically, to achieve more stable and accurate dose predictions, three highly correlated tasks are included in our TransMTDP network: a main dose prediction task that assigns each pixel a fine-grained dose value, an auxiliary isodose lines prediction task that produces coarse-grained dose ranges, and an auxiliary gradient prediction task that learns subtle gradient information such as radiation patterns and edges in the dose maps. The three correlated tasks are integrated through a shared encoder, following the multi-task learning strategy. To strengthen the connection between the output layers of the different tasks, we further apply two additional constraints, an isodose consistency loss and a gradient consistency loss, to reinforce the match between the dose distribution features generated by the auxiliary tasks and the main task. Additionally, since many organs in the human body are symmetrical and the dose maps present abundant global features, we embed a transformer into our framework to capture the long-range dependencies of the dose maps. Evaluated on an in-house rectum cancer dataset and a public head and neck cancer dataset, our method achieves superior performance compared with state-of-the-art methods. Code is available at https://github.com/luuuwen/TransMTDP.
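The two consistency constraints described in the abstract can be pictured as follows: derive an isodose-band map and a gradient map from the main task's predicted dose, and penalize their mismatch with the auxiliary tasks' outputs. The NumPy sketch below is an illustrative assumption of that idea, not the authors' implementation; all function names, the quantization via `np.digitize`, the gradient-magnitude definition, and the MSE form of the losses are choices made here for clarity.

```python
import numpy as np

def isodose_lines(dose, levels):
    """Quantize a continuous dose map into coarse isodose bands."""
    return np.digitize(dose, levels).astype(np.float64)

def spatial_gradient(dose):
    """Per-pixel gradient magnitude of a 2-D dose map."""
    gy, gx = np.gradient(dose)
    return np.sqrt(gx ** 2 + gy ** 2)

def consistency_losses(pred_dose, pred_iso, pred_grad, levels):
    """MSE penalties matching features derived from the main-task dose
    with the auxiliary isodose and gradient predictions."""
    iso_loss = np.mean((isodose_lines(pred_dose, levels) - pred_iso) ** 2)
    grad_loss = np.mean((spatial_gradient(pred_dose) - pred_grad) ** 2)
    return iso_loss, grad_loss

# Toy check: auxiliary outputs derived from the same dose map
# incur zero consistency penalty.
dose = np.linspace(0.0, 1.0, 64).reshape(8, 8)
levels = [0.25, 0.5, 0.75]
iso = isodose_lines(dose, levels)
grad = spatial_gradient(dose)
print(consistency_losses(dose, iso, grad, levels))  # → (0.0, 0.0)
```

In a training loop these two terms would be weighted and added to the main dose-prediction loss, encouraging the three heads of the shared-encoder network to agree on the same underlying dose distribution.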
Pages: 16
Related Papers
50 items (showing [41]-[50])
  • [41] Hotspot Detection via Multi-task Learning and Transformer Encoder
    Zhu, Binwu
    Chen, Ran
    Zhang, Xinyun
    Yang, Fan
    Zeng, Xuan
    Yu, Bei
    Wong, Martin D. F.
    2021 IEEE/ACM INTERNATIONAL CONFERENCE ON COMPUTER AIDED DESIGN (ICCAD), 2021,
  • [42] Multi-task agency: a combinatorial model
    Bardsley, P
    JOURNAL OF ECONOMIC BEHAVIOR & ORGANIZATION, 2001, 44 (02) : 233 - 248
  • [43] Inverted Pyramid Multi-task Transformer for Dense Scene Understanding
    Ye, Hanrong
    Xu, Dan
    COMPUTER VISION - ECCV 2022, PT XXVII, 2022, 13687 : 514 - 530
  • [44] CoTexT: Multi-task Learning with Code-Text Transformer
    Long Phan
    Hieu Tran
    Le, Daniel
    Hieu Nguyen
    Anibal, James
    Peltekian, Alec
    Ye, Yanfang
    NLP4PROG 2021: THE 1ST WORKSHOP ON NATURAL LANGUAGE PROCESSING FOR PROGRAMMING (NLP4PROG 2021), 2021, : 40 - 47
  • [45] Multi-task learning by hierarchical Dirichlet mixture model for sparse failure prediction
    Luo, Simon
    Chu, Victor W.
    Li, Zhidong
    Wang, Yang
    Zhou, Jianlong
    Chen, Fang
    Wong, Raymond K.
    INTERNATIONAL JOURNAL OF DATA SCIENCE AND ANALYTICS, 2021, 12 (01) : 15 - 29
  • [47] Multi-task Envisioning Transformer-based Autoencoder for Corporate Credit Rating Migration Early Prediction
    Yue, Han
    Xia, Steve
    Liu, Hongfu
    PROCEEDINGS OF THE 28TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, KDD 2022, 2022, : 4452 - 4460
  • [48] Deep multi-task learning model for time series prediction in wireless communication
    Cao, Kailin
    Hu, Ting
    Li, Zishuo
    Zhao, Guoshuai
    Qian, Xueming
    PHYSICAL COMMUNICATION, 2021, 44
  • [49] MTLMetro: A Deep Multi-Task Learning Model for Metro Passenger Demands Prediction
    Huang, Hao
    Mao, Jiannan
    Liu, Ronghui
    Lu, Weike
    Tang, Tianli
    Liu, Lan
    IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS, 2024, 25 (09) : 11805 - 11820
  • [50] ProPept-MT: A Multi-Task Learning Model for Peptide Feature Prediction
    He, Guoqiang
    He, Qingzu
    Cheng, Jinyan
    Yu, Rongwen
    Shuai, Jianwei
    Cao, Yi
    INTERNATIONAL JOURNAL OF MOLECULAR SCIENCES, 2024, 25 (13)