KGTuner: Efficient Hyper-parameter Search for Knowledge Graph Learning

Cited by: 0
Authors
Zhang, Yongqi [1 ]
Zhou, Zhanke [1 ,2 ]
Yao, Quanming [3 ]
Li, Yong [3 ]
Affiliations
[1] 4Paradigm Inc, Beijing, Peoples R China
[2] Hong Kong Baptist Univ, Hong Kong, Peoples R China
[3] Tsinghua Univ, Dept Elect Engn, Beijing, Peoples R China
Keywords
(none listed)
DOI
Not available
CLC Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
While hyper-parameters (HPs) are important for knowledge graph (KG) learning, existing methods fail to search them efficiently. To solve this problem, we first analyze the properties of different HPs and measure how well configurations transfer from a small subgraph to the full graph. Based on this analysis, we propose an efficient two-stage search algorithm, KGTuner, which explores HP configurations on a small subgraph in the first stage and transfers the top-performing configurations to the large full graph for fine-tuning in the second stage. Experiments show that our method consistently finds better HPs than the baseline algorithms within the same time budget, achieving a 9.1% average relative improvement for four embedding models on the large-scale KGs in the Open Graph Benchmark.
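The two-stage procedure described in the abstract can be sketched as follows. This is a minimal illustration only, not the authors' implementation: the configuration space, the subgraph sampler, and the two evaluation callbacks (`eval_on_subgraph`, `eval_on_full_graph`) are hypothetical stand-ins, and stage one is shown as plain random search rather than KGTuner's actual exploration strategy.

```python
import random

def two_stage_search(space, eval_on_subgraph, eval_on_full_graph,
                     n_explore=100, n_finetune=5):
    """Two-stage HP search sketch: cheap exploration on a small
    subgraph, then fine-tuning of the top configurations on the
    full graph. Higher evaluation scores are assumed to be better."""
    # Stage 1: evaluate many sampled configurations on the small subgraph.
    sampled = [{k: random.choice(v) for k, v in space.items()}
               for _ in range(n_explore)]
    scored = [(eval_on_subgraph(cfg), cfg) for cfg in sampled]
    # Sort by score only (key=...) so dict configs are never compared.
    scored.sort(key=lambda t: t[0], reverse=True)

    # Stage 2: transfer the top configurations and re-evaluate them
    # on the (expensive) full graph, keeping the best one.
    best_score, best_cfg = float("-inf"), None
    for _, cfg in scored[:n_finetune]:
        score = eval_on_full_graph(cfg)
        if score > best_score:
            best_score, best_cfg = score, cfg
    return best_cfg, best_score
```

The key design point this sketch captures is the budget split: many cheap subgraph evaluations filter the space, and only a handful of full-graph evaluations are spent on the surviving candidates.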
Pages: 2715 - 2735
Page count: 21
Related Papers
50 entries in total
  • [1] An efficient hyper-parameter optimization method for supervised learning
    Shi, Ying
    Qi, Hui
    Qi, Xiaobo
    Mu, Xiaofang
    APPLIED SOFT COMPUTING, 2022, 126
  • [2] Random Search for Hyper-Parameter Optimization
    Bergstra, James
    Bengio, Yoshua
    JOURNAL OF MACHINE LEARNING RESEARCH, 2012, 13 : 281 - 305
  • [3] Hyper-Parameter Tuning for Graph Kernels via Multiple Kernel Learning
    Massimo, Carlo M.
    Navarin, Nicolo
    Sperduti, Alessandro
    NEURAL INFORMATION PROCESSING, ICONIP 2016, PT II, 2016, 9948 : 214 - 223
  • [4] Federated learning with hyper-parameter optimization
    Kundroo, Majid
    Kim, Taehong
    JOURNAL OF KING SAUD UNIVERSITY-COMPUTER AND INFORMATION SCIENCES, 2023, 35 (09)
  • [5] Efficient Hyper-parameter Optimization with Cubic Regularization
    Shen, Zhenqian
    Yang, Hansi
    Li, Yong
    Kwok, James
    Yao, Quanming
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023
  • [6] A study on depth classification of defects by machine learning based on hyper-parameter search
    Chen, Haoze
    Zhang, Zhijie
    Yin, Wuliang
    Zhao, Chenyang
    Wang, Fengxiang
    Li, Yanfeng
    MEASUREMENT, 2022, 189
  • [7] Effectiveness of Random Search in SVM hyper-parameter tuning
    Mantovani, Rafael G.
    Rossi, Andre L. D.
    Vanschoren, Joaquin
    Bischl, Bernd
    de Carvalho, Andre C. P. L. F.
    2015 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2015
  • [8] Efficient Federated Learning with Adaptive Client-Side Hyper-Parameter Optimization
    Kundroo, Majid
    Kim, Taehong
    2023 IEEE 43RD INTERNATIONAL CONFERENCE ON DISTRIBUTED COMPUTING SYSTEMS, ICDCS, 2023 : 973 - 974
  • [9] FastTuning: Enabling Fast and Efficient Hyper-Parameter Tuning With Partitioning and Parallelism of Search Space
    Li, Xiaqing
    Guo, Qi
    Zhang, Guangyan
    Ye, Siwei
    He, Guanhua
    Yao, Yiheng
    Zhang, Rui
    Hao, Yifan
    Du, Zidong
    Zheng, Weimin
    IEEE TRANSACTIONS ON PARALLEL AND DISTRIBUTED SYSTEMS, 2024, 35 (07) : 1174 - 1188