Enhancing Collaborative and Geometric Multi-Kernel Learning Using Deep Neural Network

Cited: 3
Authors
Zafar, Bareera [1 ]
Naqvi, Syed Abbas Zilqurnain [1 ]
Ahsan, Muhammad [1 ]
Ditta, Allah [2 ]
Baneen, Ummul [1 ]
Khan, Muhammad Adnan [3 ,4 ]
Affiliations
[1] Univ Engn & Technol, Dept Mechatron & Control Engn, Lahore 54890, Pakistan
[2] Univ Educ, Div Sci & Technol, Dept Informat Sci, Lahore 54000, Pakistan
[3] Riphah Int Univ, Fac Comp, Riphah Sch Comp & Innovat, Lahore Campus, Lahore 54000, Pakistan
[4] Gachon Univ, Dept Software, Seongnam 13120, South Korea
Source
CMC-COMPUTERS MATERIALS & CONTINUA | 2022, Vol. 72, No. 3
Keywords
CGMKL; multi-class classification; deep neural network; multiple kernel learning; hierarchical kernel spaces; KERNEL; CLASSIFICATION; SELECTION;
DOI
10.32604/cmc.2022.027874
CLC Classification
TP [automation technology; computer technology];
Subject Classification
0812
Abstract
This research proposes enhanced collaborative and geometric multi-kernel learning (E-CGMKL), a method that improves the CGMKL algorithm for multi-class classification problems with non-linear data distributions. CGMKL combines multiple kernel learning with the softmax function using the framework of multiple empirical kernel learning (MEKL), in which empirical kernel mapping (EKM) provides explicit feature construction in the high-dimensional kernel space. CGMKL ensures consistent outputs for samples across kernel spaces and minimizes the within-class distance to highlight the geometric features of multiple classes. However, the kernels constructed by CGMKL have no explicit relationship among them and build their high-dimensional feature representations independently of one another, which can be disadvantageous when learning on datasets with complex hidden structures. To overcome this limitation, E-CGMKL constructs kernel spaces from the hidden layers of a trained deep neural network (DNN). Owing to the DNN architecture, these kernel spaces not only provide multiple feature representations but also inherit the compositional hierarchy of the hidden layers, which can benefit the predictive performance of CGMKL on complex data with natural hierarchical structure, such as image data. Furthermore, the proposed scheme handles image data by constructing kernel spaces from a convolutional neural network (CNN). Given the effectiveness of the CNN architecture on image data, these kernel spaces offer a major advantage over CGMKL, which does not exploit the CNN architecture for constructing kernel spaces from image data. Additionally, the hidden-layer outputs directly provide features for the kernel spaces and, unlike CGMKL, do not require an approximate MEKL framework.
E-CGMKL combines the consistency- and geometry-preserving aspects of CGMKL with the compositional hierarchy of kernel spaces extracted from DNN hidden layers to significantly enhance the predictive performance of CGMKL. Experimental results on various datasets demonstrate the superior performance of E-CGMKL compared to competing methods, including the benchmark CGMKL.
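The core construction described in the abstract can be sketched in NumPy: the activations of each hidden layer of a (stand-in) trained network serve as separate feature spaces, each with its own softmax head; the heads are trained with cross-entropy plus an approximate consistency penalty that pulls their outputs together, and the within-class scatter that CGMKL minimizes is computed for inspection. This is a minimal illustration under assumed toy data and random network weights, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a trained DNN: fixed random weights, tanh activations.
# In E-CGMKL the network would be trained first; weights here are random
# purely for illustration.
def hidden_representations(X, weights):
    """Return the activation of every hidden layer (one feature space each)."""
    reps, H = [], X
    for W in weights:
        H = np.tanh(H @ W)
        reps.append(H)
    return reps

def softmax(Z):
    Z = Z - Z.max(axis=1, keepdims=True)
    E = np.exp(Z)
    return E / E.sum(axis=1, keepdims=True)

# Toy 3-class data in 2-D, one Gaussian blob per class.
n_per, d, n_classes = 30, 2, 3
means = np.array([[2.0, 0.0], [0.0, 2.0], [-2.0, -2.0]])
X = np.vstack([rng.normal(loc=m, scale=0.5, size=(n_per, d)) for m in means])
y = np.repeat(np.arange(n_classes), n_per)
Y = np.eye(n_classes)[y]                      # one-hot labels

# Two hidden layers -> two feature spaces playing the role of kernel spaces.
weights = [rng.normal(size=(d, 8)), rng.normal(size=(8, 8))]
reps = hidden_representations(X, weights)

# One linear softmax head per feature space, trained jointly with
# (a) cross-entropy and (b) an approximate-gradient consistency penalty
# that pulls the heads' outputs toward their mean.
heads = [np.zeros((H.shape[1], n_classes)) for H in reps]
lr, lam_cons = 0.1, 0.1
for step in range(300):
    probs = [softmax(H @ W) for H, W in zip(reps, heads)]
    mean_p = sum(probs) / len(probs)
    for k, (H, W, P) in enumerate(zip(reps, heads, probs)):
        grad = H.T @ (P - Y) / len(X)                    # cross-entropy
        grad += lam_cons * H.T @ (P - mean_p) / len(X)   # consistency
        heads[k] = W - lr * grad

# Within-class scatter per feature space: the geometric quantity CGMKL
# minimizes. In the full algorithm it also shapes the features; here it
# is only reported.
def within_class_scatter(H, y):
    return sum(((H[y == c] - H[y == c].mean(axis=0)) ** 2).sum()
               for c in np.unique(y)) / len(H)

scatters = [within_class_scatter(H, y) for H in reps]
final = sum(softmax(H @ W) for H, W in zip(reps, heads)) / len(reps)
acc = (final.argmax(axis=1) == y).mean()
print("per-layer within-class scatter:", scatters)
print(f"combined training accuracy: {acc:.2f}")
```

Averaging the per-layer softmax outputs for the final prediction is one simple choice of combination; the consistency penalty makes the heads agree, so the average changes little from any single head while smoothing out layer-specific errors.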
Pages: 5099-5116
Page count: 18
Related Papers
50 records total
  • [31] Enhancing deep neural networks via multiple kernel learning
    Lauriola, Ivano
    Gallicchio, Claudio
    Aiolli, Fabio
    [J]. PATTERN RECOGNITION, 2020, 101
  • [32] Gossiped and Quantized Online Multi-Kernel Learning
    Ortega, Tomas
    Jafarkhani, Hamid
    [J]. IEEE SIGNAL PROCESSING LETTERS, 2023, 30 : 468 - 472
  • [33] Expanding Design Spaces in Digital Composite Materials: A Multi-Input Deep Learning Approach Enhanced by Transfer Learning and Multi-kernel Network
    Park, Donggeun
    Park, Minwoo
    Ryu, Seunghwa
    [J]. ADVANCED THEORY AND SIMULATIONS, 2023, 6 (11)
  • [34] Multi-kernel partial label learning using graph contrast disambiguation
    Li, Hongyan
    Wan, Zhonglin
    Vong, Chi Man
    [J]. APPLIED INTELLIGENCE, 2024, 54 (20) : 9760 - 9782
  • [35] The Optimal Solution of Multi-kernel Regularization Learning
    Hong Wei SUN
    Ping LIU
    [J]. Acta Mathematica Sinica, 2013, 29 (08) : 1607 - 1616
  • [36] Learning rates of multi-kernel regularized regression
    Chen, Hong
    Li, Luoqing
    [J]. JOURNAL OF STATISTICAL PLANNING AND INFERENCE, 2010, 140 (09) : 2562 - 2568
  • [37] Stock Volatility Prediction using Multi-Kernel Learning based Extreme Learning Machine
    Wang, Feng
    Zhao, Zhiyong
    Li, Xiaodong
    Yu, Fei
    Zhang, Hao
    [J]. PROCEEDINGS OF THE 2014 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2014, : 3078 - 3085
  • [38] Multi-Kernel Excitation Network for Video Action Recognition
    Tian, Qingze
    Wang, Kun
    Liu, Baodi
    Wang, Yanjiang
    [J]. 2022 16TH IEEE INTERNATIONAL CONFERENCE ON SIGNAL PROCESSING (ICSP2022), VOL 1, 2022, : 155 - 159
  • [39] Learning rates for multi-kernel linear programming classifiers
    Cao, Feilong
    Xing, Xing
    [J]. FRONTIERS OF MATHEMATICS IN CHINA, 2011, 6 (02) : 203 - 219
  • [40] Learning with multi-kernel Growing Support Vector Classifiers
    Zhou Jian-guo
    Wang Xiao-wei
    [J]. ISDA 2006: SIXTH INTERNATIONAL CONFERENCE ON INTELLIGENT SYSTEMS DESIGN AND APPLICATIONS, VOL 1, 2006, : 188 - 194