GreenNAS: A Green Approach to the Hyperparameters Tuning in Deep Learning

Cited by: 2
Authors
Franchini, Giorgia [1 ]
Affiliations
[1] Univ Modena & Reggio Emilia, Dept Sci Phys Informat & Math, I-41125 Modena, Italy
Keywords
deep learning; convolutional neural networks; neural architecture search; hyperparameters tuning; performance predictor; GreenAI;
DOI
10.3390/math12060850
CLC classification number
O1 [Mathematics];
Subject classification codes
0701; 070101;
Abstract
This paper addresses the challenge of hyperparameter tuning in deep learning models and proposes a green approach to neural architecture search that minimizes its environmental impact. Traditional neural architecture search sweeps the entire space of candidate architectures, which is computationally expensive and time-consuming. To address this issue, performance predictors have recently been proposed to estimate the performance of different architectures, thereby reducing the search space and speeding up exploration. The proposed approach develops a performance predictor by training only a small percentage of the possible hyperparameter configurations; the predictor can then be queried to identify the best configurations without training them on the dataset. Numerical experiments on image denoising and classification evaluate the proposed approach in terms of both predictive performance and time complexity.
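The workflow the abstract describes can be sketched in miniature: train a cheap predictor on a small sample of (configuration, performance) pairs, then query it over the full grid instead of training every candidate. The sketch below is illustrative only and is not the paper's method or code; the hyperparameter names (`lr`, `depth`, `width`), the synthetic `true_performance` stand-in, and the nearest-neighbour predictor are all assumptions chosen for a self-contained example.

```python
# Illustrative sketch (NOT the paper's code): a performance predictor
# fitted on ~20% of the hyperparameter grid, then queried to rank the
# remaining configurations without training them.
import itertools
import random

random.seed(0)

# Hypothetical hyperparameter grid (names chosen for illustration).
grid = [
    {"lr": lr, "depth": d, "width": w}
    for lr, d, w in itertools.product([1e-3, 1e-2, 1e-1], [2, 4, 8], [16, 32, 64])
]

def true_performance(cfg):
    # Stand-in for an expensive train-and-evaluate run on the real dataset.
    return (1.0 - abs(cfg["lr"] - 1e-2)
            - 0.01 * abs(cfg["depth"] - 4)
            - 0.001 * abs(cfg["width"] - 32))

# Evaluate only a small sample of configurations (the "green" budget).
sample = random.sample(grid, k=len(grid) // 5)
observed = [(cfg, true_performance(cfg)) for cfg in sample]

def predict(cfg):
    # A simple nearest-neighbour predictor over scaled hyperparameters;
    # the paper's predictor would be a learned model, not this heuristic.
    def dist(a, b):
        return (abs(a["lr"] - b["lr"]) / 0.1
                + abs(a["depth"] - b["depth"]) / 8
                + abs(a["width"] - b["width"]) / 64)
    nearest = min(observed, key=lambda item: dist(item[0], cfg))
    return nearest[1]

# Query the predictor over the full grid; no further training is needed.
best = max(grid, key=predict)
print(best)
```

The design point is that `true_performance` is called only for the sampled 20%, while the cheap `predict` ranks all 27 configurations, which is what makes the search tractable at scale.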
Pages: 16