Contextual Self-attentive Temporal Point Process for Physical Decommissioning Prediction of Cloud Assets

Cited: 0
Authors
Yang, Fangkai [1 ]
Zhang, Jue [1 ]
Wang, Lu [1 ]
Qiao, Bo [1 ]
Weng, Di [1 ]
Qin, Xiaoting [1 ]
Weber, Gregory [2 ]
Das, Durgesh Nandini [2 ]
Rakhunathan, Srinivasan [2 ]
Srikanth, Ranganathan [2 ]
Lin, Qingwei [1 ]
Zhang, Dongmei [1 ]
Affiliations
[1] Microsoft, Beijing, Peoples R China
[2] Microsoft, Redmond, WA USA
Keywords
temporal point process; cloud asset decommission; sequence prediction; deep learning;
DOI
10.1145/3580305.3599794
Chinese Library Classification (CLC)
TP [Automation Technology; Computer Technology]
Discipline Code
0812
Abstract
As cloud computing continues to expand globally, effective management of decommissioned cloud assets in data centers becomes increasingly important. This work focuses on predicting the physical decommissioning date of cloud assets, a crucial component of reverse cloud supply chain management and data center warehouse operation. The decommissioning process is modeled as a contextual self-attentive temporal point process, which incorporates contextual information to model sequences with parallel events and yields more accurate predictions as more historical data is observed. We conducted extensive offline and online experiments in 20 sampled data centers. The results show that the proposed methodology outperforms all baselines and achieves a remarkable 94% prediction accuracy in online experiments. This modeling methodology can be extended to other domains with similar workflow-like processes.
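To make the abstract's core idea concrete: a temporal point process models event sequences through a conditional intensity function, and a self-attentive variant builds that intensity from an attention-weighted summary of past events. The following is a minimal illustrative sketch, not the paper's model: the attention scores, the recency decay, and the `decay`/`bias` parameters are all assumptions chosen for illustration, whereas the paper learns these components with a transformer-style encoder over contextual features.

```python
import math

def softplus(x):
    """Numerically stable softplus; keeps the intensity strictly positive."""
    return math.log1p(math.exp(-abs(x))) + max(x, 0.0)

def softmax(scores):
    """Standard softmax over a list of attention scores."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def intensity(event_times, t, decay=1.0, bias=0.1):
    """Toy conditional intensity lambda(t) for a self-attentive
    temporal point process: past events are attended with
    recency-based scores, and the attention-weighted context of
    inter-arrival gaps feeds a softplus link function.

    `decay` and `bias` are hypothetical parameters for this sketch,
    not quantities from the paper.
    """
    past = sorted(s for s in event_times if s < t)
    if not past:
        return softplus(bias)  # baseline intensity with no history
    # Attention scores favour recent events (hand-set recency kernel,
    # where the paper would use learned query/key projections).
    scores = [-decay * (t - s) for s in past]
    attn = softmax(scores)
    # "Values" here are inter-arrival gaps (first gap measured from 0).
    gaps = [past[0]] + [b - a for a, b in zip(past, past[1:])]
    context = sum(a * g for a, g in zip(attn, gaps))
    return softplus(bias + context)
```

In the paper the context vector additionally encodes workflow features so that parallel events in a sequence can share information; this sketch only captures the attention-over-history structure.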
Pages: 5372-5381 (10 pages)
Related Papers (50 in total)
  • [1] PROACTIVE: Self-Attentive Temporal Point Process Flows for Activity Sequences
    Gupta, Vinayak
    Bedathur, Srikanta
    PROCEEDINGS OF THE 28TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, KDD 2022, 2022, : 496 - 504
  • [2] Deep Fourier Kernel for Self-Attentive Point Processes
    Zhu, Shixiang
    Zhang, Minghe
    Ding, Ruyi
    Xie, Yao
    24TH INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS (AISTATS), 2021, 130
  • [3] Self-Attentive Moving Average for Time Series Prediction
    Su, Yaxi
    Cui, Chaoran
    Qu, Hao
    APPLIED SCIENCES-BASEL, 2022, 12 (07):
  • [4] Explicit Sparse Self-Attentive Network for CTR Prediction
    Luo, Yu
    Peng, Wanwan
    Fan, Youping
    Pang, Hong
    Xu, Xiang
    Wu, Xiaohua
    PROCEEDINGS OF THE 10TH INTERNATIONAL CONFERENCE OF INFORMATION AND COMMUNICATION TECHNOLOGY, 2021, 183 : 690 - 695
  • [5] Image Inpainting Using Contextual Feature Adjustment and Joint Self-Attentive
    Peng, Hao
    Li, Xiaoming
    Computer Engineering and Applications, 2023, 59 (19) : 184 - 191
  • [6] Goal-driven Self-Attentive Recurrent Networks for Trajectory Prediction
    Chiara, Luigi Filippo
    Coscia, Pasquale
    Das, Sourav
    Calderara, Simone
    Cucchiara, Rita
    Ballan, Lamberto
    2022 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION WORKSHOPS, CVPRW 2022, 2022, : 2517 - 2526
  • [7] Non-local Self-attentive Autoencoder for Genetic Functionality Prediction
    Li, Yun
    Liu, Zhe
    Yao, Lina
    He, Zihuai
    CIKM '20: PROCEEDINGS OF THE 29TH ACM INTERNATIONAL CONFERENCE ON INFORMATION & KNOWLEDGE MANAGEMENT, 2020, : 2117 - 2120
  • [8] Accurate disaster entity recognition based on contextual embeddings in self-attentive BiLSTM-CRF
    Hafsa, Noor E.
    Alzoubi, Hadeel Mohammed
    Almutlq, Atikah Saeed
    PLOS ONE, 2025, 20 (03):
  • [9] Enhancing wind power prediction with self-attentive variational autoencoders: A comparative study
    Harrou, Fouzi
    Dairi, Abdelkader
    Dorbane, Abdelhakim
    Sun, Ying
    RESULTS IN ENGINEERING, 2024, 23
  • [10] Disentangled Self-Attentive Neural Networks for Click-Through Rate Prediction
    Xu, Yichen
    Zhu, Yanqiao
    Yu, Feng
    Liu, Qiang
    Wu, Shu
    PROCEEDINGS OF THE 30TH ACM INTERNATIONAL CONFERENCE ON INFORMATION & KNOWLEDGE MANAGEMENT, CIKM 2021, 2021, : 3553 - 3557