Deep multilayer multiple kernel learning

Cited: 0
Authors
Ilyes Rebai
Yassine BenAyed
Walid Mahdi
Affiliations
[1] University of Sfax,MIRACL: Multimedia InfoRmation System and Advanced Computing Laboratory
[2] Taif University,College of Computers and Information Technology
Keywords
Deep learning; Support vector machine; Multilayer multiple kernel learning; Optimization methods; Gradient ascent;
DOI
Not available
Abstract
The multiple kernel learning (MKL) approach has been proposed for kernel methods and has shown strong performance on several real-world applications. It consists in learning an optimal kernel from a single layer of multiple predefined kernels. Unfortunately, this approach is not expressive enough to solve relatively complex problems. With the emergence and success of deep learning, multilayer multiple kernel learning (MLMKL) methods were introduced, inspired by the idea of deep architectures, to improve on conventional MKL. Such architectures learn deep kernel machines by exploring combinations of multiple kernels in a multilayer structure. However, existing MLMKL methods often struggle to optimize networks of two or more layers, and they do not always outperform the simplest way of combining multiple kernels (i.e., MKL). To improve the effectiveness of MKL approaches, we introduce in this paper a novel backpropagation MLMKL framework. Specifically, we propose to optimize the network with an adaptive backpropagation algorithm based on gradient ascent, rather than optimizing a dual objective function or estimating the leave-one-out error. We evaluate the proposed method through an extensive set of experiments on a variety of benchmark data sets, and we successfully optimize the system over many layers. The empirical results show that our algorithm achieves high performance compared with the traditional MKL approach and existing MLMKL methods.
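To make the multilayer idea concrete, the following is a minimal illustrative sketch (not the authors' exact method) of a two-layer kernel: an inner layer forms a weighted combination of predefined RBF kernels, and an outer layer passes that combined kernel through a further nonlinearity. The weights `mus` and bandwidths `gammas` are hypothetical parameters; in the paper's framework such weights would be trained by the backpropagation/gradient-ascent procedure, which is omitted here.

```python
import numpy as np

def rbf_kernel(X, Y, gamma):
    # Pairwise squared Euclidean distances, then the Gaussian RBF kernel.
    sq = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * sq)

def two_layer_mkl_kernel(X, Y, mus, gammas, outer_gamma):
    # Layer 1: convex combination of base RBF kernels with weights mus.
    K1 = sum(m * rbf_kernel(X, Y, g) for m, g in zip(mus, gammas))
    # Layer 2: apply an element-wise exponential to the combined kernel.
    # The exponential of a PSD kernel is itself PSD (its series expansion
    # is a sum of Hadamard powers), so the result is still a valid kernel.
    return np.exp(outer_gamma * K1)
```

The resulting Gram matrix can be passed to any kernel machine (e.g., an SVM with a precomputed kernel); stacking further layers follows the same pattern of combining and transforming kernels.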
Pages: 2305 - 2314
Page count: 9
Related papers
50 results
  • [1] Deep multilayer multiple kernel learning
    Rebai, Ilyes
    BenAyed, Yassine
    Mahdi, Walid
    NEURAL COMPUTING & APPLICATIONS, 2016, 27(8): 2305 - 2314
  • [2] Multilayer deep features with multiple kernel learning for action recognition
    Sheng, Biyun
    Li, Jun
    Xiao, Fu
    Yang, Wankou
    NEUROCOMPUTING, 2020, 399 : 65 - 74
  • [3] Deep multiple multilayer kernel learning in core vector machines
    Afzal, A. L.
    Asharaf, S.
    EXPERT SYSTEMS WITH APPLICATIONS, 2018, 96 : 149 - 156
  • [4] Deep Multiple Kernel Learning
    Strobl, Eric V.
    Visweswaran, Shyam
    2013 12TH INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND APPLICATIONS (ICMLA 2013), VOL 1, 2013, : 414 - 417
  • [5] DEEP KERNEL LEARNING NETWORKS WITH MULTIPLE LEARNING PATHS
    Xu, Ping
    Wang, Yue
    Chen, Xiang
    Tian, Zhi
    2022 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2022, : 4438 - 4442
  • [6] A DEEP LEARNING APPROACH TO MULTIPLE KERNEL FUSION
    Song, Huan
    Thiagarajan, Jayaraman J.
    Sattigeri, Prasanna
    Ramamurthy, Karthikeyan Natesan
    Spanias, Andreas
    2017 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2017, : 2292 - 2296
  • [7] Bridging deep and multiple kernel learning: A review
    Wang, Tinghua
    Zhang, Lin
    Hu, Wenyu
    INFORMATION FUSION, 2021, 67 : 3 - 13
  • [8] Deep Multiple Kernel Learning for Prediction of MicroRNA Precursors
    Shi, Hengyue
    Wang, Dong
    Wu, Peng
    Cao, Yi
    Chen, Yuehui
    SCIENTIFIC PROGRAMMING, 2021, 2021
  • [9] An enhancing multiple kernel extreme learning machine based on deep learning
    Zhang, Meng
    Sun, Rongkang
    Cui, Tong
    Ren, Yan
    39TH YOUTH ACADEMIC ANNUAL CONFERENCE OF CHINESE ASSOCIATION OF AUTOMATION, YAC 2024, 2024, : 1340 - 1345
  • [10] Distributionally Robust Optimization for Deep Kernel Multiple Instance Learning
    Sapkota, Hitesh
    Ying, Yiming
    Chen, Feng
    Yu, Qi
    24TH INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS (AISTATS), 2021, 130