Multi-label dimensionality reduction and classification with extreme learning machines

Cited by: 1
Authors
Lin Feng [1 ,2 ]
Jing Wang [1 ,2 ]
Shenglan Liu [1 ,2 ]
Yao Xiao [1 ,2 ]
Affiliations
[1] Faculty of Electronic Information and Electrical Engineering, School of Computer Science and Technology, Dalian University of Technology
[2] School of Innovation Experiment, Dalian University of Technology
Funding
National Natural Science Foundation of China
Keywords
multi-label; dimensionality reduction; kernel trick; classification;
DOI
None
CLC Number
TP181 [automated reasoning; machine learning]
Subject Classification Code
Abstract
Driven by real-world applications such as text categorization and image classification, multi-label learning has become an active research area in recent years, and much attention has been paid to multi-label classification algorithms. Because the high dimensionality of multi-label datasets can cause the curse of dimensionality and hamper classification, a dimensionality reduction algorithm, multi-label kernel discriminant analysis (MLKDA), is proposed to reduce the dimensionality of multi-label datasets. Using the kernel trick, MLKDA processes the multiple labels integrally and realizes nonlinear dimensionality reduction with an idea similar to linear discriminant analysis (LDA). For the classification of multi-label data, the extreme learning machine (ELM) is an efficient algorithm that maintains good accuracy. Combined with ELM, MLKDA performs well in multi-label learning experiments on several datasets. Experiments on both static data and data streams show that MLKDA outperforms multi-label dimensionality reduction via dependence maximization (MDDM) and multi-label linear discriminant analysis (MLDA) on balanced datasets and on datasets with stronger correlation between labels, and that ELM is also a good choice for multi-label classification.
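The ELM half of the pipeline is a standard, well-documented algorithm: hidden-layer weights are drawn at random and only the output weights are fit, by a least-squares solve. The sketch below illustrates that idea for multi-label data, with per-label thresholding of the output scores; it is a minimal NumPy illustration, not the paper's exact configuration (hidden-layer size, activation, and threshold are assumptions here).

```python
import numpy as np

def elm_train(X, Y, n_hidden=64, rng=None):
    """Train a basic ELM: random hidden layer, least-squares output layer.

    X: (n_samples, n_features), Y: (n_samples, n_labels) binary label matrix.
    """
    rng = np.random.default_rng(rng)
    W = rng.standard_normal((X.shape[1], n_hidden))  # random input weights (never trained)
    b = rng.standard_normal(n_hidden)                # random hidden biases
    H = np.tanh(X @ W + b)                           # hidden-layer activations
    beta = np.linalg.pinv(H) @ Y                     # output weights via Moore-Penrose pseudoinverse
    return W, b, beta

def elm_predict(X, W, b, beta, threshold=0.5):
    """Score each label and threshold independently (multi-label output)."""
    H = np.tanh(X @ W + b)
    return (H @ beta >= threshold).astype(int)
```

In a full reproduction of the paper's pipeline, `X` would be the low-dimensional representation produced by MLKDA rather than the raw features; the ELM step itself is unchanged.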
Pages: 502–513 (12 pages)
Related Papers
50 records in total
  • [31] Compact learning for multi-label classification
    Lv, Jiaqi
    Wu, Tianran
    Peng, Chenglun
    Liu, Yunpeng
    Xu, Ning
    Geng, Xin
    PATTERN RECOGNITION, 2021, 113
  • [32] On active learning in multi-label classification
    Brinker, K
    FROM DATA AND INFORMATION ANALYSIS TO KNOWLEDGE ENGINEERING, 2006, : 206 - 213
  • [33] Learning multi-label scene classification
    Boutell, MR
    Luo, JB
    Shen, XP
    Brown, CM
    PATTERN RECOGNITION, 2004, 37 (09) : 1757 - 1771
  • [34] To Be or not to Be, Tail Labels in Extreme Multi-label Learning
    Ge, Zhiqi
    Li, Ximing
    PROCEEDINGS OF THE 30TH ACM INTERNATIONAL CONFERENCE ON INFORMATION & KNOWLEDGE MANAGEMENT, CIKM 2021, 2021, : 555 - 564
  • [35] Learning and Explanation of Extreme Multi-label Deep Classification Models for Media Content
    Minici, Marco
    Pisani, Francesco Sergio
    Guarascio, Massimo
    De Francesco, Erika
    Lambardi, Pasquale
    FOUNDATIONS OF INTELLIGENT SYSTEMS (ISMIS 2022), 2022, 13515 : 138 - 148
  • [36] Joint graph regularized extreme learning machine for multi-label image classification
    Yang, Xingjiang
    Zhou, Yong
    Zhu, Qingxing
    Wu, Zhendong
    JOURNAL OF COMPUTATIONAL METHODS IN SCIENCES AND ENGINEERING, 2018, 18 (01) : 213 - 219
  • [37] Joint learning of multi-label classification and label correlations
    He, Zhi-Fen
    Yang, Ming
    Liu, Hui-Dong
    Ruan Jian Xue Bao/Journal of Software, 2014, 25 (09): : 1967 - 1981
  • [38] Noisy multi-label semi-supervised dimensionality reduction
    Mikalsen, Karl Oyvind
    Soguero-Ruiz, Cristina
    Bianchi, Filippo Maria
    Jenssen, Robert
    PATTERN RECOGNITION, 2019, 90 : 257 - 270
  • [39] Scalable Label Distribution Learning for Multi-Label Classification
    Zhao, Xingyu
    An, Yuexuan
    Qi, Lei
    Geng, Xin
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024,
  • [40] Correlation Networks for Extreme Multi-label Text Classification
    Xun, Guangxu
    Jha, Kishlay
    Sun, Jianhui
    Zhang, Aidong
    KDD '20: PROCEEDINGS OF THE 26TH ACM SIGKDD INTERNATIONAL CONFERENCE ON KNOWLEDGE DISCOVERY & DATA MINING, 2020, : 1074 - 1082