Neural Generalization of Multiple Kernel Learning

Times Cited: 2
Authors
Ghanizadeh, Ahmad Navid [1 ]
Ghiasi-Shirazi, Kamaledin [2 ]
Monsefi, Reza [2 ]
Qaraei, Mohammadreza [3 ]
Affiliations
[1] Saarland Univ, Dept Comp Sci, Saarbrucken, Germany
[2] Ferdowsi Univ Mashhad, Dept Comp Engn, Mashhad, Iran
[3] Aalto Univ, Dept Comp Sci, Helsinki, Finland
Keywords
Multiple kernel learning; MKL; Deep learning; Kernel methods; Neural networks; Classification
DOI
10.1007/s11063-024-11516-0
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Multiple Kernel Learning (MKL) is a conventional way to learn the kernel function in kernel-based methods, and MKL algorithms enhance the performance of kernel methods. However, MKL models have lower complexity than deep models and fall short of them in recognition accuracy. Deep learning models can learn complex functions by applying nonlinear transformations to data through several layers. In this paper, we show that a typical MKL algorithm can be interpreted as a one-layer neural network with linear activation functions. Building on this interpretation, we propose a Neural Generalization of Multiple Kernel Learning (NGMKL), which extends the conventional MKL framework to a multi-layer neural network with nonlinear activation functions. Our experiments show that the proposed method, which has a higher complexity than traditional MKL methods, leads to higher recognition accuracy on several benchmarks.
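The interpretation in the abstract can be made concrete with a small sketch: a conventional MKL score is a weighted sum of per-kernel responses (one linear layer), and the neural generalization passes those responses through nonlinear activations and further layers. This is an illustrative numpy sketch under assumptions of ours, not the authors' implementation; the function names (`mkl_score`, `ngmkl_score`), the RBF base kernels, the `tanh` activation, and the layer sizes are all hypothetical choices for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_kernel(X, Z, gamma):
    """Gram matrix of an RBF kernel between the rows of X and Z."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def mkl_score(kernel_mats, alphas, beta):
    """Conventional MKL viewed as a one-layer linear net:
    per-kernel 'hidden unit' h_m = K_m @ alpha_m,
    output = sum_m beta_m * h_m (linear activations throughout)."""
    h = np.stack([K @ a for K, a in zip(kernel_mats, alphas)], axis=1)
    return h @ beta

def ngmkl_score(kernel_mats, alphas, W, act=np.tanh):
    """NGMKL-style forward pass (sketch): a nonlinear activation on the
    per-kernel responses, followed by an extra learned layer."""
    h = act(np.stack([K @ a for K, a in zip(kernel_mats, alphas)], axis=1))
    return act(h @ W[0]) @ W[1]

# Toy data: 5 points in 3-D, two RBF base kernels with different widths.
X = rng.normal(size=(5, 3))
Ks = [rbf_kernel(X, X, g) for g in (0.5, 2.0)]
alphas = [rng.normal(size=5) for _ in Ks]
beta = np.array([0.7, 0.3])                       # kernel-combination weights
W = [rng.normal(size=(2, 4)), rng.normal(size=4)]  # extra-layer weights

linear_out = mkl_score(Ks, alphas, beta)
neural_out = ngmkl_score(Ks, alphas, W)
print(linear_out.shape, neural_out.shape)  # (5,) (5,)
```

With identity activations and the extra layer collapsed, `ngmkl_score` reduces to `mkl_score`, which is the one-layer-linear-network reading of MKL that the paper generalizes.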
Pages: 14
Related Papers
(50 records)
  • [1] Neural Generalization of Multiple Kernel Learning
    Ghanizadeh, Ahmad Navid
    Ghiasi-Shirazi, Kamaledin
    Monsefi, Reza
    Qaraei, Mohammadreza
    Neural Processing Letters, 56
  • [2] Generalization Bounds for Coregularized Multiple Kernel Learning
    Wu, Xinxing
    Hu, Guosheng
    Computational Intelligence and Neuroscience, 2018
  • [3] Neural Tangent Kernel: Convergence and Generalization in Neural Networks
    Jacot, Arthur
    Gabriel, Franck
    Hongler, Clement
    Advances in Neural Information Processing Systems 31 (NIPS 2018), 2018
  • [4] Stability and Generalization of Kernel Clustering: From Single Kernel to Multiple Kernel
    Liang, Weixuan
    Liu, Xinwang
    Liu, Yong
    Zhou, Sihang
    Huang, Jun-Jie
    Wang, Siwei
    Liu, Jiyuan
    Zhang, Yi
    Zhu, En
    Advances in Neural Information Processing Systems 35 (NeurIPS 2022), 2022
  • [5] Enhancing deep neural networks via multiple kernel learning
    Lauriola, Ivano
    Gallicchio, Claudio
    Aiolli, Fabio
    Pattern Recognition, 2020, 101
  • [6] Infinite Kernel Learning: Generalization Bounds and Algorithms
    Liu, Yong
    Liao, Shizhong
    Lin, Hailun
    Yue, Yinliang
    Wang, Weiping
    Thirty-First AAAI Conference on Artificial Intelligence, 2017: 2280 - 2286
  • [7] Convolutional spectral kernel learning with generalization guarantees
    Li, Jian
    Liu, Yong
    Wang, Weiping
    Artificial Intelligence, 2022, 313
  • [8] Neural Tangent Kernel: Convergence and Generalization in Neural Networks (Invited Paper)
    Jacot, Arthur
    Gabriel, Franck
    Hongler, Clement
    STOC '21: Proceedings of the 53rd Annual ACM SIGACT Symposium on Theory of Computing, 2021: 6 - 6
  • [9] Do Kernel and Neural Embeddings Help in Training and Generalization?
    Rahbar, Arman
    Jorge, Emilio
    Dubhashi, Devdatt
    Chehreghani, Morteza Haghir
    Neural Processing Letters, 2023, 55 (02): 1681 - 1695