Online Model-based Testing Under Uncertainty

Cited by: 20

Authors
Camilli, Matteo [1 ]
Bellettini, Carlo [1 ]
Gargantini, Angelo [2 ]
Scandurra, Patrizia [2 ]
Affiliations
[1] Univ Milan, Dept Comp Sci, Milan, Italy
[2] Univ Bergamo, Dept Management Informat & Prod Engn, Bergamo, Italy
Keywords
Uncertainty Quantification; Reliability under Uncertainty; Bayesian Calibration; Online Model-based Testing;
DOI
10.1109/ISSRE.2018.00015
CLC Number
TP31 [Computer Software];
Discipline Code
081202; 0835;
Abstract
Modern software systems are required to operate in highly uncertain and changing environments. They have to control the satisfaction of their requirements at run-time and, possibly, adapt to cope with situations that have not been completely addressed at design-time. Software engineering methods and techniques are, more than ever, forced to deal explicitly with change and uncertainty (lack of knowledge). To tackle the challenge that uncertainty poses to delivering more reliable systems, this paper proposes a novel online Model-based Testing technique that complements classic test case generation based on pseudo-random sampling strategies with an uncertainty-aware sampling strategy. To deal with system uncertainty during testing, the proposed strategy builds on an Inverse Uncertainty Quantification approach that addresses the discrepancy between data measured at run-time (while the system executes) and a Markov Decision Process model describing the behavior of the system under test. To this end, a conformance game approach is adopted in which tests feed a Bayesian inference calibrator that continuously learns from test data to tune the system model and the system itself. A comparative evaluation between the proposed uncertainty-aware sampling policy and classical pseudo-random sampling policies is also presented using the Tele Assistance System running example, showing the differences in achieved accuracy and efficiency.
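The abstract describes a Bayesian calibrator that updates an uncertain Markov Decision Process model from test observations and an uncertainty-aware policy that steers test sampling. The following is a minimal illustrative sketch, not taken from the paper: it assumes a hypothetical `BayesianCalibrator` class, models each action's uncertain transition distribution as a Dirichlet posterior (updated by conjugate pseudo-counts), and selects the next action to test as the one whose posterior is widest.

```python
class BayesianCalibrator:
    """Hypothetical sketch: Dirichlet-based calibration of uncertain MDP
    transition probabilities from test outcomes, with uncertainty-aware
    action selection (test where the model knows least)."""

    def __init__(self, actions, successors, prior=1.0):
        # One Dirichlet posterior per action, initialised with a uniform prior.
        self.counts = {a: {s: prior for s in successors[a]} for a in actions}

    def observe(self, action, successor):
        # Conjugate update: increment the pseudo-count of the observed successor.
        self.counts[action][successor] += 1.0

    def mean(self, action):
        # Posterior mean estimate of the transition distribution for `action`.
        total = sum(self.counts[action].values())
        return {s: c / total for s, c in self.counts[action].items()}

    def variance(self, action):
        # Sum of Dirichlet marginal (Beta) variances: large when data is scarce.
        total = sum(self.counts[action].values())
        return sum((c / total) * (1 - c / total) / (total + 1)
                   for c in self.counts[action].values())

    def next_action(self):
        # Uncertainty-aware sampling: pick the action with the widest posterior.
        return max(self.counts, key=self.variance)


# Usage: two actions with binary outcomes. After many observations of
# "invoke", its posterior is narrow, so testing shifts toward "retry".
cal = BayesianCalibrator(actions=["invoke", "retry"],
                         successors={"invoke": ["ok", "fail"],
                                     "retry": ["ok", "fail"]})
for _ in range(20):
    cal.observe("invoke", "ok")
assert cal.next_action() == "retry"
```

The paper's actual technique operates online against the running system and also tunes the system itself; this fragment only illustrates the conjugate-update and variance-driven-sampling idea in isolation.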
Pages: 36 - 46
Page count: 11