Learning Queuing Networks by Recurrent Neural Networks

Cited by: 8
Authors
Garbi, Giulio [1]
Incerto, Emilio [1]
Tribastone, Mirco [1]
Affiliations
[1] IMT School for Advanced Studies Lucca, Lucca, Italy
Keywords
software performance; queuing networks; recurrent neural networks; performance prediction; inference; systems; demand
DOI
10.1145/3358960.3379134
CLC number
TP [Automation and computer technology];
Subject classification code
0812;
Abstract
It is well known that building analytical performance models in practice is difficult because it requires a considerable degree of proficiency in the underlying mathematics. In this paper, we propose a machine-learning approach to derive performance models from data. We focus on queuing networks, and crucially exploit a deterministic approximation of their average dynamics in terms of a compact system of ordinary differential equations. We encode these equations into a recurrent neural network whose weights can be directly related to model parameters. This allows for an interpretable structure of the neural network, which can be trained from system measurements to yield a white-box parameterized model that can be used for prediction purposes such as what-if analyses and capacity planning. Using synthetic models as well as a real case study of a load-balancing system, we show the effectiveness of our technique in yielding models with high predictive power.
Pages: 56-66
Page count: 11
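The abstract describes encoding the fluid (ordinary differential equation) approximation of a queuing network's average dynamics into a recurrent cell whose weights correspond directly to model parameters. The sketch below is an illustrative reconstruction of that idea, not the authors' implementation: it assumes a closed network with a known routing matrix and fits the per-station service rates by unrolling one Euler step per time instant, much like an RNN. All names (FluidQNCell, rollout), the station setup, and numerical choices such as the step size are hypothetical.

```python
import torch

class FluidQNCell(torch.nn.Module):
    """One Euler step of the fluid approximation dx/dt = (P^T - I) diag(mu) min(x, s)."""
    def __init__(self, routing, servers, dt=0.01):
        super().__init__()
        self.register_buffer("P", torch.as_tensor(routing, dtype=torch.float32))
        self.register_buffer("s", torch.as_tensor(servers, dtype=torch.float32))
        self.dt = dt
        # Interpretable weights: log of per-station service rates (exp keeps them positive).
        self.log_mu = torch.nn.Parameter(torch.zeros(self.P.shape[0]))

    def forward(self, x):
        mu = torch.exp(self.log_mu)
        completions = mu * torch.minimum(x, self.s)   # per-station completion rates
        inflow = self.P.t() @ completions             # jobs routed into each station
        return x + self.dt * (inflow - completions)   # Euler update of mean queue lengths

def rollout(cell, x0, steps):
    """Unroll the cell like an RNN to predict the mean queue-length trajectory."""
    xs, x = [x0], x0
    for _ in range(steps):
        x = cell(x)
        xs.append(x)
    return torch.stack(xs)

# Hypothetical usage: two stations in a closed cycle, ten jobs starting at station 0.
routing = [[0.0, 1.0], [1.0, 0.0]]
cell = FluidQNCell(routing, servers=[1.0, 2.0])
x0 = torch.tensor([10.0, 0.0])
predicted = rollout(cell, x0, steps=500)   # shape: (501, 2)
# loss = torch.mean((predicted - measured) ** 2); loss.backward(); optimizer.step()
```

Under these assumptions, training amounts to minimizing the squared error between the rollout and measured average queue lengths with a standard optimizer; the learned exp(log_mu) values can then be read off as service-rate estimates, which is what makes the trained network a white-box model usable for what-if analysis and capacity planning.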