Multitask Learning for Sparse Failure Prediction

Cited by: 2
Authors
Luo, Simon [1 ,2 ]
Chu, Victor W. [3 ]
Li, Zhidong [2 ,4 ]
Wang, Yang [2 ,4 ]
Zhou, Jianlong [2 ,4 ]
Chen, Fang [2 ,4 ]
Wong, Raymond K. [5 ]
Affiliations
[1] Univ Sydney, Sydney, NSW, Australia
[2] Nanyang Technol Univ, Singapore, Singapore
[3] CSIRO, Data61, Sydney, NSW, Australia
[4] Univ Technol Sydney, Ultimo, Australia
[5] Univ New South Wales, Kensington, NSW, Australia
Keywords
Multi-task learning; Sparse predictions; Dirichlet process; Beta process; Failure predictions
DOI
10.1007/978-3-030-16148-4_1
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Sparsity is a problem that occurs inherently in many real-world datasets. Sparsity induces an imbalance in the data, which has an adverse effect on machine learning and hence reduces predictability. Previously, domain experts overcame sparsity by imposing strong, albeit subjective, assumptions on the model parameters based on their experience. In contrast, we propose a multi-task learning solution that automatically learns model parameters from a common latent structure shared across data from related domains. Although related, such datasets commonly have overlapping but dissimilar feature spaces and therefore cannot simply be combined into a single dataset. Our proposed model, the hierarchical Dirichlet process mixture of hierarchical beta processes (HDP-HBP), learns tasks with a common model parameter for the failure prediction model using a hierarchical Dirichlet process. Our model uses recorded failure history to make failure predictions on a water supply network. Multi-task learning draws additional information from the failure records of water supply networks managed by other utility companies to improve prediction in one network. We achieve superior accuracy for sparse predictions compared to previous state-of-the-art models and demonstrate that the model can be used in risk management to proactively repair critical infrastructure.
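The hierarchical beta process component described in the abstract can be illustrated with a minimal, hedged sketch. This is not the authors' implementation: it is a standard finite (beta-Bernoulli) approximation in which a shared base probability per failure mode couples the tasks, and each task (e.g. each utility's network) draws its own probabilities centred on the shared layer. All function and parameter names (`hbp_sample`, `c0`, `c`, `q`) are illustrative assumptions, not from the paper.

```python
import random

def hbp_sample(num_tasks, num_features, c0=2.0, c=5.0, q=0.3, seed=0):
    """Finite beta-Bernoulli approximation of a hierarchical beta process.

    Shared layer: a global probability p_k is drawn per failure mode k.
    Task layer:   each task draws its own probability centred on p_k,
                  which is what lets sparse tasks borrow strength from
                  data-rich related tasks.
    All hyperparameter names here are illustrative assumptions.
    """
    rng = random.Random(seed)
    # Shared layer: global failure-mode probabilities, mean q, concentration c0.
    base = [rng.betavariate(c0 * q, c0 * (1.0 - q)) for _ in range(num_features)]
    # Task layer: per-task probabilities, mean p_k, concentration c.
    tasks = [[rng.betavariate(c * p, c * (1.0 - p)) for p in base]
             for _ in range(num_tasks)]
    return base, tasks

# Example: 3 related networks (tasks) sharing 4 latent failure modes.
base, tasks = hbp_sample(num_tasks=3, num_features=4)
```

Larger values of the task-level concentration `c` pull each task's probabilities closer to the shared layer, i.e. stronger information sharing across networks.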
Pages: 3-14 (12 pages)
Related papers
50 items
  • [41] Sparse Subarray Design for Multitask Receivers
    Deligiannis, Anastasios
    Amin, Moeness
    Fabrizio, Giuseppe
    Lambotharan, Sangarapillai
    2018 IEEE RADAR CONFERENCE (RADARCONF18), 2018, : 742 - 747
  • [42] Improved graph-based multitask learning model with sparse sharing for quantitative structure-property relationship prediction of drug molecules
    Hu, Haoyang
    Bai, Yunke
    Yuan, Zhihong
    AICHE JOURNAL, 2023, 69 (02)
  • [43] Learning to Multitask
    Zhang, Yu
    Wei, Ying
    Yang, Qiang
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31
  • [44] Multitask learning
    Caruana, R
    MACHINE LEARNING, 1997, 28 (01) : 41 - 75
  • [46] Multitask defect prediction
    Ni, Chao
    Chen, Xiang
    Xia, Xin
    Gu, Qing
    Zhao, Yingquan
    JOURNAL OF SOFTWARE-EVOLUTION AND PROCESS, 2019, 31 (12)
  • [47] Online Proximal Learning Over Jointly Sparse Multitask Networks With l∞,1 Regularization
    Jin, Danqi
    Chen, Jie
    Richard, Cedric
    Chen, Jingdong
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2020, 68 : 6319 - 6335
  • [48] Linearized and Kernelized Sparse Multitask Learning for Predicting Cognitive Outcomes in Alzheimer's Disease
    Liu, Xiaoli
    Cao, Peng
    Yang, Jinzhu
    Zhao, Dazhe
    COMPUTATIONAL AND MATHEMATICAL METHODS IN MEDICINE, 2018, 2018
  • [49] A Multitask Learning-Based Model for Gas Classification and Concentration Prediction
    Dai, Yang
    Xiong, Yin
    Lin, He
    Li, Yunlong
    Feng, Yunhao
    Luo, Wan
    Zhong, Xiaojiang
    IEEE SENSORS JOURNAL, 2024, 24 (07) : 11639 - 11650
  • [50] Distributed Typhoon Track Prediction Based on Complex Features and Multitask Learning
    Sun, Yongjiao
    Song, Yaning
    Qiao, Baiyou
    Li, Boyang
    COMPLEXITY, 2021, 2021