Meta Learning for Hyperparameter Optimization in Dialogue System

Cited by: 22
Authors
Chien, Jen-Tzung [1 ]
Lieow, Wei Xiang [1 ]
Affiliations
[1] Natl Chiao Tung Univ, Dept Elect & Comp Engn, Hsinchu, Taiwan
Source
INTERSPEECH 2019
Keywords
dialogue system; meta learning; Bayesian optimization; recurrent neural network
DOI
10.21437/Interspeech.2019-1383
Chinese Library Classification
R36 [Pathology]; R76 [Otorhinolaryngology]
Subject Classification Codes
100104; 100213
Abstract
The performance of a dialogue system based on deep reinforcement learning (DRL) depends heavily on the hyperparameters selected for the DRL algorithms. Traditionally, a Gaussian process (GP) provides a probabilistic approach to Bayesian optimization for sequential search, which is beneficial for selecting optimal hyperparameters. However, GP suffers from rapidly growing computation as the dimensionality of the hyperparameters and the number of search points increase. This paper presents a meta learning approach to multifidelity Bayesian optimization in which a two-level recurrent neural network (RNN) is developed for sequential learning and optimization. The search space is explored by the first-level RNN with cheap, low-fidelity evaluations over a global region of hyperparameters. The optimization is then exploited by the second-level RNN with high-fidelity evaluations on successively smaller regions. Experiments on hyperparameter optimization for a dialogue system based on a deep Q network show the effectiveness and efficiency of the proposed multifidelity Bayesian optimization.
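To make the two-level, multifidelity search pattern described in the abstract concrete, the following Python sketch runs a cheap, low-fidelity pass over the full hyperparameter space and then a high-fidelity pass on a narrowed region around the best coarse point. It is only a minimal illustration: random proposals stand in for the paper's learned RNN controllers, and evaluate_policy, the search bounds, and all budget numbers are assumptions rather than the authors' setup.

# Minimal sketch of a two-level, multifidelity hyperparameter search:
# a cheap, low-fidelity pass explores the full space, then a high-fidelity
# pass exploits a narrowed region around the best coarse point. Random
# proposals replace the paper's learned RNN controllers, and
# evaluate_policy is a synthetic stand-in for training a deep Q-network
# dialogue policy; all names and budgets are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical search space: log10(learning rate) and discount factor gamma.
BOUNDS = np.array([[-5.0, -2.0],    # log10(learning rate)
                   [0.90, 0.999]])  # discount factor

def evaluate_policy(params, fidelity):
    """Placeholder objective: higher is better. `fidelity` mimics the number
    of training episodes; low fidelity gives a noisier reward estimate."""
    log_lr, gamma = params
    true_score = -(log_lr + 3.5) ** 2 - 50.0 * (gamma - 0.98) ** 2
    noise = rng.normal(scale=1.0 / np.sqrt(fidelity))
    return true_score + noise

def random_candidates(bounds, n):
    """Uniform proposals inside `bounds` (stand-in for the RNN proposer)."""
    low, high = bounds[:, 0], bounds[:, 1]
    return rng.uniform(low, high, size=(n, len(bounds)))

# Level 1: cheap, low-fidelity exploration over the global region.
coarse = random_candidates(BOUNDS, n=40)
coarse_scores = [evaluate_policy(p, fidelity=5) for p in coarse]
center = coarse[int(np.argmax(coarse_scores))]

# Level 2: high-fidelity exploitation of a shrunken region around the best point.
width = 0.2 * (BOUNDS[:, 1] - BOUNDS[:, 0])
local_bounds = np.stack([np.maximum(center - width, BOUNDS[:, 0]),
                         np.minimum(center + width, BOUNDS[:, 1])], axis=1)
fine = random_candidates(local_bounds, n=10)
fine_scores = [evaluate_policy(p, fidelity=100) for p in fine]
best = fine[int(np.argmax(fine_scores))]

print("best hyperparameters: log10(lr)=%.3f, gamma=%.4f" % (best[0], best[1]))

The design choice mirrored here is the one the abstract emphasizes: spend most evaluations at low fidelity to locate a promising region cheaply, and reserve the expensive, high-fidelity runs for exploitation within that small region.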
Pages: 839-843
Page count: 5
Related Papers
50 items in total
  • [21] Genetic CFL: Hyperparameter Optimization in Clustered Federated Learning
    Agrawal, Shaashwat
    Sarkar, Sagnik
    Alazab, Mamoun
    Maddikunta, Praveen Kumar Reddy
    Gadekallu, Thippa Reddy
Pham, Quoc-Viet
    COMPUTATIONAL INTELLIGENCE AND NEUROSCIENCE, 2021, 2021
  • [22] Online Hyperparameter Optimization for Class-Incremental Learning
    Liu, Yaoyao
    Li, Yingying
    Schiele, Bernt
    Sun, Qianru
    THIRTY-SEVENTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 37 NO 7, 2023, : 8906 - 8913
  • [23] An Approach to Hyperparameter Optimization for the Objective Function in Machine Learning
    Kim, Yonghoon
    Chung, Mokdong
    ELECTRONICS, 2019, 8 (11)
  • [24] Hyperparameter Optimization of the Machine Learning Model for Distillation Processes
    Oh, Kwang Cheol
    Kwon, Hyukwon
    Park, Sun Yong
    Kim, Seok Jun
    Kim, Junghwan
    Kim, Daehyun
    INTERNATIONAL JOURNAL OF INTELLIGENT SYSTEMS, 2024, 2024
  • [25] Hyperparameter Optimization for Deep Residual Learning in Image Classification
    Jafar, Abbas
    Myungho, Lee
    2020 IEEE INTERNATIONAL CONFERENCE ON AUTONOMIC COMPUTING AND SELF-ORGANIZING SYSTEMS COMPANION (ACSOS-C 2020), 2020, : 24 - 29
  • [26] On hyperparameter optimization of machine learning algorithms: Theory and practice
    Yang, Li
    Shami, Abdallah
    NEUROCOMPUTING, 2020, 415 : 295 - 316
  • [27] PriorBand: Practical Hyperparameter Optimization in the Age of Deep Learning
    Mallik, Neeratyoy
    Bergman, Edward
    Hvarfner, Carl
    Stoll, Danny
    Janowski, Maciej
    Lindauer, Marius
    Nardi, Luigi
    Hutter, Frank
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [28] Image steganalysis using active learning and hyperparameter optimization
    Li, Bohang
    Li, Ningxin
    Yang, Jing
    Alfarraj, Osama
    Albelhai, Fahad
    Tolba, Amr
    Shaikh, Zaffar Ahmed
    Alizadehsani, Roohallah
    Plawiak, Pawel
    Yee, Por Lip
    SCIENTIFIC REPORTS, 2025, 15 (01):
  • [29] Meta-learning Hyperparameter Performance Prediction with Neural Processes
    Wei, Ying
    Zhao, Peilin
    Huang, Junzhou
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139, 2021, 139
  • [30] A Research on Automatic Hyperparameter Recommendation via Meta-Learning
    Deng, Liping
    ProQuest Dissertations and Theses Global, 2023,