Model-Agnostic Event Log Augmentation for Predictive Process Monitoring

Cited by: 2
Authors
Kaeppel, Martin [1 ]
Jablonski, Stefan [1 ]
Affiliations
[1] Univ Bayreuth, Inst Comp Sci, Bayreuth, Germany
Keywords
Predictive Process Monitoring; Data Augmentation; Data Scarcity;
DOI
10.1007/978-3-031-34560-9_23
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Predictive process monitoring aims to predict how the execution of a running process instance will evolve until its completion. Deep learning techniques have been shown to perform well on various prediction tasks, such as next activity prediction, remaining time prediction, or outcome prediction. However, the quality and performance of these models are highly dependent on the amount of available training data, as deep learning models require large amounts of data to generalize well. In practice, available event logs usually contain only a few thousand, more or less redundant, records, which is insufficient given the large number of parameters that must be estimated during training. For this reason, data augmentation is widely used in machine learning research to increase the amount of available training data by applying transformations to existing samples and thereby creating new samples synthetically. Since data augmentation is still largely unexplored in predictive process monitoring, this paper proposes an initial set of simple noise-based transformations that can be applied to any event log and boost the performance of existing predictive process monitoring approaches. Our experimental evaluation shows that predictive process monitoring approaches for predicting the next activity benefit from this data augmentation technique in terms of both prediction performance and stability of the training process.
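The abstract does not spell out the concrete transformations, so the following is only a minimal, illustrative sketch of what simple noise-based trace augmentation for an event log could look like. The function names (augment_trace, augment_log), the particular insert/delete/swap operations, and all probability parameters are assumptions for illustration and do not reproduce the authors' method.

import random

def augment_trace(trace, vocabulary, p_insert=0.1, p_delete=0.1, p_swap=0.1, rng=None):
    # Apply simple noise-based transformations to one trace (a list of activity labels).
    rng = rng or random.Random()
    augmented = list(trace)

    # Insert a random activity from the log's activity vocabulary with probability p_insert.
    if rng.random() < p_insert:
        augmented.insert(rng.randrange(len(augmented) + 1), rng.choice(vocabulary))

    # Delete a random activity with probability p_delete (keep at least one event).
    if len(augmented) > 1 and rng.random() < p_delete:
        del augmented[rng.randrange(len(augmented))]

    # Swap two adjacent activities with probability p_swap.
    if len(augmented) > 1 and rng.random() < p_swap:
        i = rng.randrange(len(augmented) - 1)
        augmented[i], augmented[i + 1] = augmented[i + 1], augmented[i]

    return augmented

def augment_log(log, factor=2, seed=42, **noise):
    # Add (factor - 1) noisy variants per original trace, keeping the originals.
    rng = random.Random(seed)
    vocabulary = sorted({activity for trace in log for activity in trace})
    augmented_log = list(log)
    for trace in log:
        for _ in range(factor - 1):
            augmented_log.append(augment_trace(trace, vocabulary, rng=rng, **noise))
    return augmented_log

if __name__ == "__main__":
    log = [
        ["register", "check", "decide", "notify"],
        ["register", "check", "recheck", "decide", "notify"],
    ]
    for trace in augment_log(log, factor=3):
        print(trace)

In such a setup the noise probabilities would be kept small so that augmented traces remain plausible variants of the original control flow before being used to train a next-activity prediction model.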
Pages: 381 - 397
Number of pages: 17
Related papers
50 records in total
  • [1] Model-Agnostic Augmentation for Accurate Graph Classification
    Yoo, Jaemin
    Shim, Sooyeon
    Kang, U.
    PROCEEDINGS OF THE ACM WEB CONFERENCE 2022 (WWW'22), 2022, : 1281 - 1291
  • [2] Event Log Sampling for Predictive Monitoring
    Sani, Mohammadreza Fani
    Vazifehdoostirani, Mozhgan
    Park, Gyunam
    Pegoraro, Marco
    van Zelst, Sebastiaan J.
    van der Aalst, Wil M. P.
    PROCESS MINING WORKSHOPS, ICPM 2021, 2022, 433 : 154 - 166
  • [3] Model-Agnostic Federated Learning
    Mittone, Gianluca
    Riviera, Walter
    Colonnelli, Iacopo
    Birke, Robert
    Aldinucci, Marco
    EURO-PAR 2023: PARALLEL PROCESSING, 2023, 14100 : 383 - 396
  • [4] Model-Agnostic Private Learning
    Bassily, Raef
    Thakkar, Om
    Thakurta, Abhradeep
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31
  • [5] Is Bayesian Model-Agnostic Meta Learning Better than Model-Agnostic Meta Learning, Provably?
    Chen, Lisha
    Chen, Tianyi
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 151, 2022, 151
  • [6] Model-agnostic variable importance for predictive uncertainty: an entropy-based approach
    Wood, Danny
    Papamarkou, Theodore
    Benatan, Matt
    Allmendinger, Richard
    DATA MINING AND KNOWLEDGE DISCOVERY, 2024, : 4184 - 4216
  • [7] SegRefiner: Towards Model-Agnostic Segmentation Refinement with Discrete Diffusion Process
    Wang, Mengyu
    Ding, Henghui
    Liew, Jun Hao
    Liu, Jiajun
    Zhao, Yao
    Wei, Yunchao
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [8] RelEx: A Model-Agnostic Relational Model Explainer
    Zhang, Yue
    Defazio, David
    Ramesh, Arti
    AIES '21: PROCEEDINGS OF THE 2021 AAAI/ACM CONFERENCE ON AI, ETHICS, AND SOCIETY, 2021, : 1042 - 1049
  • [9] Model-Agnostic Interpretability with Shapley Values
    Messalas, Andreas
    Kanellopoulos, Yiannis
    Makris, Christos
    2019 10TH INTERNATIONAL CONFERENCE ON INFORMATION, INTELLIGENCE, SYSTEMS AND APPLICATIONS (IISA), 2019, : 220 - 226
  • [10] A Model-Agnostic Heuristics for Selective Classification
    Pugnana, Andrea
    Ruggieri, Salvatore
    THIRTY-SEVENTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 37 NO 8, 2023, : 9461 - 9469