Fine-tuning of algorithms using fractional experimental designs and local search

Cited by: 241
Authors
Adenso-Díaz, B
Laguna, M
Affiliations
[1] Univ Oviedo, Escuela Super Ingenieros Ind, Gijon 33204, Spain
[2] Univ Colorado, Leeds Sch Business, Boulder, CO 80309 USA
DOI
10.1287/opre.1050.0243
CLC classification
C93 [Management Science]
Discipline codes
12 ; 1201 ; 1202 ; 120202
Abstract
Researchers and practitioners frequently spend more time fine-tuning algorithms than designing and implementing them. This is particularly true when developing heuristics and metaheuristics, where the "right" choice of values for search parameters has a considerable effect on the performance of the procedure. When testing metaheuristics, performance is typically measured by considering both the quality of the solutions obtained and the time needed to find them. In this paper, we describe the development of CALIBRA, a procedure that attempts to find the best values for up to five search parameters associated with a procedure under study. Because CALIBRA uses Taguchi's fractional factorial experimental designs coupled with a local search procedure, the best values found are not guaranteed to be optimal. We test CALIBRA on six existing heuristic-based procedures. These experiments show that CALIBRA is able to find parameter values that either match or improve upon the performance obtained with the parameter values suggested by the developers of those procedures. The latest version of CALIBRA can be downloaded for free from the website that appears in the online supplement of this paper at http://or.pubs.informs.org/Pages.collect.html.
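The two-phase idea in the abstract can be illustrated with a small sketch. The code below is not CALIBRA itself: it pairs a Taguchi L4(2^3) orthogonal-array screening step (4 runs instead of all 2^3 = 8 corner settings for three two-level parameters) with a simple coordinate-descent local search. The objective `cost` is a hypothetical quadratic stand-in for "run the heuristic with these parameter values and measure its performance"; the parameter box and target values are illustrative assumptions.

```python
# Taguchi L4(2^3) orthogonal array: 4 runs cover 3 two-level factors.
L4 = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]

def screen(objective, lows, highs):
    """Fractional-factorial screening: evaluate only the 4 L4 runs,
    mapping each factor's 0/1 level onto its [low, high] range."""
    best_x, best_f = None, float("inf")
    for levels in L4:
        x = [lo + (hi - lo) * lev for lo, hi, lev in zip(lows, highs, levels)]
        f = objective(x)
        if f < best_f:
            best_x, best_f = x, f
    return best_x, best_f

def local_search(objective, x, f, step=1.0, tol=1e-3):
    """Coordinate hill climbing around the screening winner,
    halving the step size whenever no move improves the objective."""
    while step > tol:
        improved = False
        for i in range(len(x)):
            for d in (step, -step):
                y = list(x)
                y[i] += d
                fy = objective(y)
                if fy < f:
                    x, f, improved = y, fy, True
        if not improved:
            step /= 2
    return x, f

# Hypothetical stand-in objective: pretend the tuned heuristic performs
# best when its three parameters equal (1.0, 2.0, 3.0).
def cost(x):
    return sum((xi - t) ** 2 for xi, t in zip(x, (1.0, 2.0, 3.0)))

x, f = screen(cost, lows=(0.0, 0.0, 0.0), highs=(4.0, 4.0, 4.0))
x, f = local_search(cost, x, f)
print(x, f)  # → [1.0, 2.0, 3.0] 0.0
```

As in the paper's caveat, a greedy local search seeded by a fractional design can stop at a local optimum on a harder response surface; the sketch only shows why the combination is cheap, not that it is optimal.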
Pages: 99-114 (16 pages)
Related papers
(50 in total)
  • [41] Fine-tuning memory macros using variable internal delays
    Gray, K
    IEEE SPECTRUM, 1999, 36 (08) : 44 - 49
  • [42] Frequency Extension of Radio Propagation Model Using Fine-Tuning
    Nagao, Tatsuya
    Hayashi, Takahiro
    IEICE COMMUNICATIONS EXPRESS, 2020, 12 (09): : 499 - 504
  • [43] Detection of abnormal fish by image recognition using fine-tuning
    Okawa, Ryusei
    Iwasaki, Nobuo
    Okamoto, Kazuya
    Marsh, David
    ARTIFICIAL LIFE AND ROBOTICS, 2023, 28 : 175 - 180
  • [44] Gemstone classification using ConvNet with transfer learning and fine-tuning
    Freire, Willian M.
    Amaral, Aline M. M. M.
    Costa, Yandre M. G.
    2022 29TH INTERNATIONAL CONFERENCE ON SYSTEMS, SIGNALS AND IMAGE PROCESSING (IWSSIP), 2022,
  • [45] OPUS-CAT: Desktop NMT with CAT integration and local fine-tuning
    Nieminen, Tommi
    EACL 2021: THE 16TH CONFERENCE OF THE EUROPEAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: PROCEEDINGS OF THE SYSTEM DEMONSTRATIONS, 2021, : 288 - 294
  • [46] Fine-tuning gas separation performance of copolymer polyimide by the regulation of local microstructure
    Huang, Yujie
    Li, Kaihua
    Zhang, Yue
    Wang, Ge
    Ma, Yuchao
    Jiao, Long
    Shu, Dengkun
    Yang, Shuo
    Ma, Xiaohua
    Zhang, Qian
    Yang, Leixin
    Cheng, Bowen
    JOURNAL OF MEMBRANE SCIENCE, 2025, 718
  • [47] Rollback Ensemble With Multiple Local Minima in Fine-Tuning Deep Learning Networks
    Ro, Youngmin
    Choi, Jongwon
    Heo, Byeongho
    Choi, Jin Young
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2022, 33 (09) : 4648 - 4660
  • [48] Fine-Tuning Heat Stress Algorithms to Optimise Global Predictions of Mass Coral Bleaching
    Lachs, Liam
    Bythell, John C.
    East, Holly K.
    Edwards, Alasdair J.
    Mumby, Peter J.
    Skirving, William J.
    Spady, Blake L.
    Guest, James R.
    REMOTE SENSING, 2021, 13 (14)
  • [49] The search for optimal dual antiplatelet therapy after PCI: fine-tuning of initiation and duration
    Luescher, Thomas F.
    EUROPEAN HEART JOURNAL, 2016, 37 (04) : 319 - 321
  • [50] Ensemble Knowledge Guided Sub-network Search and Fine-Tuning for Filter Pruning
    Lee, Seunghyun
    Song, Byung Cheol
    COMPUTER VISION, ECCV 2022, PT XI, 2022, 13671 : 569 - 585