Fast Approximated Multiple Kernel K-means

Cited by: 0
Authors
Wang J. [1 ]
Tang C. [2 ]
Zheng X. [1 ]
Liu X. [1 ]
Zhang W. [3 ]
Zhu E. [1 ]
Zhu X. [4 ]
Affiliations
[1] School of Computer, National University of Defense Technology, Changsha
[2] School of Computer Science, China University of Geosciences, Wuhan
[3] Shandong Provincial Key Laboratory of Computer Networks, Shandong Computer Science Center (National Supercomputing Center in Jinan), Qilu University of Technology (Shandong Academy of Sciences), Jinan
[4] School of Computer Science and Technology (School of Artificial Intelligence), Zhejiang Normal University, Jinhua
Funding
National Natural Science Foundation of China
Keywords
Clustering algorithms; Computational complexity; data fusion; Kernel; Linear programming; Matrix decomposition; Multi-view clustering; multiple kernel $k$-means; Optimization; partition learning; Partitioning algorithms
DOI
10.1109/TKDE.2023.3340743
Abstract
Multiple Kernel Clustering (MKC) has emerged as a prominent research domain in recent decades due to its capacity to exploit diverse information from multiple views by learning an optimal kernel. Despite the successes achieved by various MKC methods, a significant challenge lies in the computational complexity associated with generating a consensus partition from the optimal kernel matrix, typically of size $n \times n$, where $n$ represents the number of samples. This computational bottleneck restricts the practical applicability of these methods when confronted with large-scale datasets. Furthermore, certain existing MKC algorithms derive the consensus partition matrix by fusing all base partitions. However, this fusion process may inadvertently overlook critical information embedded in individual base kernels, potentially leading to inferior clustering performance. In light of these challenges, we introduce an innovative and efficient multiple kernel $k$-means approach, denoted as FAMKKM. Notably, FAMKKM incorporates two approximated partition matrices instead of the original individual partition matrix for each base kernel. This strategic substitution significantly reduces computational complexity. Additionally, FAMKKM leverages the original kernel information to guide the fusion of all base partitions, thereby enhancing the quality of the resulting consensus partition matrix. Finally, we substantiate the efficacy and efficiency of the proposed FAMKKM through extensive experiments conducted on six benchmark datasets. Our results demonstrate its superiority over state-of-the-art methods. The demo code of this work is publicly available at https://github.com/WangJun2023/FAMKKM
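To make the $n \times n$ bottleneck concrete, the following is a minimal sketch of a plain multiple kernel $k$-means baseline (uniform squared kernel weights, spectral relaxation, Lloyd rounding). All names and parameter choices here are illustrative assumptions; this is the classical MKKM-style pipeline the abstract contrasts against, not the authors' FAMKKM algorithm.

```python
import numpy as np

def rbf_kernel(X, gamma):
    # Full n x n Gram matrix -- this is where memory/time scale quadratically.
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def mkkm_baseline(kernels, k, weights=None):
    m = len(kernels)
    w = np.full(m, 1.0 / m) if weights is None else np.asarray(weights)
    # Fuse base kernels; classical MKKM uses K = sum_p w_p^2 K_p.
    K = sum(wp**2 * Kp for wp, Kp in zip(w, kernels))
    # Spectral relaxation: top-k eigenvectors of the fused n x n kernel
    # give the relaxed consensus partition matrix H (the costly step).
    vals, vecs = np.linalg.eigh(K)          # eigenvalues in ascending order
    H = vecs[:, -k:]                        # n x k relaxed partition
    # Round H to discrete labels with a few Lloyd iterations
    # (deterministic init: first and last rows of H).
    centers = H[[0, len(H) - 1]].copy()
    for _ in range(20):
        d = ((H[:, None, :] - centers[None]) ** 2).sum(-1)
        labels = np.argmin(d, axis=1)
        centers = np.stack([
            H[labels == c].mean(0) if (labels == c).any() else centers[c]
            for c in range(k)
        ])
    return labels

# Two well-separated Gaussian blobs viewed through two RBF kernels.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.3, (30, 2)), rng.normal(3, 0.3, (30, 2))])
labels = mkkm_baseline([rbf_kernel(X, 0.5), rbf_kernel(X, 2.0)], k=2)
print(len(set(labels.tolist())))
```

Every base kernel and the fused kernel are dense $n \times n$ matrices, and the eigendecomposition is $O(n^3)$ in general; replacing the per-kernel partition step with approximated partition matrices, as the abstract describes, is what removes this bottleneck.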
Pages: 1-10
Page count: 9
Related papers (50 in total)
  • [1] Fusion Multiple Kernel K-means
    Zhang, Yi
    Liu, Xinwang
    Liu, Jiyuan
    Dai, Sisi
    Zhang, Changwang
    Xu, Kai
    Zhu, En
THIRTY-SIXTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FOURTH CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE / TWELFTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2022, : 9109 - 9117
  • [2] Discrete Multiple Kernel k-means
    Wang, Rong
    Lu, Jitao
    Lu, Yihang
    Nie, Feiping
    Li, Xuelong
    PROCEEDINGS OF THE THIRTIETH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, IJCAI 2021, 2021, : 3111 - 3117
  • [3] Localized Simple Multiple Kernel K-means
    Liu, Xinwang
    Zhou, Sihang
    Liu, Li
    Tang, Chang
    Wang, Siwei
    Liu, Jiyuan
    Zhang, Yi
    2021 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2021), 2021, : 9273 - 9281
  • [4] Localized Incomplete Multiple Kernel k-means
    Zhu, Xinzhong
    Liu, Xinwang
    Li, Miaomiao
    Zhu, En
    Liu, Li
    Cai, Zhiping
    Yin, Jianping
    Gao, Wen
    PROCEEDINGS OF THE TWENTY-SEVENTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2018, : 3271 - 3277
  • [5] SimpleMKKM: Simple Multiple Kernel K-Means
    Liu, Xinwang
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2023, 45 (04) : 5174 - 5186
  • [6] Multiple Kernel k-Means with Incomplete Kernels
    Liu, Xinwang
    Li, Miaomiao
    Wang, Lei
    Dou, Yong
    Yin, Jianping
    Zhu, En
    THIRTY-FIRST AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2017, : 2259 - 2265
  • [7] Multiple Kernel k-Means with Incomplete Kernels
    Liu, Xinwang
    Zhu, Xinzhong
    Li, Miaomiao
    Wang, Lei
    Zhu, En
    Liu, Tongliang
    Kloft, Marius
    Shen, Dinggang
    Yin, Jianping
    Gao, Wen
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2020, 42 (05) : 1191 - 1204
  • [8] Scalable Multiple Kernel k-means Clustering
    Lu, Yihang
    Xin, Haonan
    Wang, Rong
    Nie, Feiping
    Li, Xuelong
    PROCEEDINGS OF THE 31ST ACM INTERNATIONAL CONFERENCE ON INFORMATION AND KNOWLEDGE MANAGEMENT, CIKM 2022, 2022, : 4279 - 4283
  • [9] Simple multiple kernel k-means with kernel weight regularization
    Li, Miaomiao
    Zhang, Yi
    Liu, Suyuan
    Liu, Zhe
    Zhu, Xinzhong
    INFORMATION FUSION, 2023, 100
  • [10] Regularized Simple Multiple Kernel k-Means With Kernel Average Alignment
    Li, Miaomiao
    Zhang, Yi
    Ma, Chuan
    Liu, Suyuan
    Liu, Zhe
    Yin, Jianping
    Liu, Xinwang
    Liao, Qing
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2023, : 1 - 10