Multi-label dimensionality reduction and classification with extreme learning machines

Cited by: 1
Authors
Lin Feng [1,2]
Jing Wang [1,2]
Shenglan Liu [1,2]
Yao Xiao [1,2]
Affiliations
[1] Faculty of Electronic Information and Electrical Engineering, School of Computer Science and Technology, Dalian University of Technology
[2] School of Innovation Experiment, Dalian University of Technology
Funding
National Natural Science Foundation of China;
Keywords
multi-label; dimensionality reduction; kernel trick; classification;
DOI
Not available
CLC number
TP181 [automated reasoning and machine learning];
Subject classification code
Abstract
Driven by real-world applications such as text categorization and image classification, multi-label learning has become an active research topic in recent years, and much attention has been paid to multi-label classification algorithms. Because the high dimensionality of multi-label datasets can cause the curse of dimensionality and hamper classification, a dimensionality reduction algorithm named multi-label kernel discriminant analysis (MLKDA) is proposed to reduce the dimensionality of multi-label datasets. Using the kernel trick, MLKDA treats the label set as a whole and performs nonlinear dimensionality reduction with an idea similar to that of linear discriminant analysis (LDA). For classifying the reduced multi-label data, the extreme learning machine (ELM) is an efficient algorithm that maintains good accuracy. Combined with ELM, MLKDA performs well in multi-label learning experiments on several datasets. Experiments on both static data and data streams show that MLKDA outperforms multi-label dimensionality reduction via dependence maximization (MDDM) and multi-label linear discriminant analysis (MLDA) on balanced datasets with stronger correlation between labels, and that ELM is also a good choice for multi-label classification.
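
To make the classification stage concrete, below is a minimal NumPy sketch of a standard single-hidden-layer ELM applied to multi-label data: random, fixed input weights, a sigmoid hidden layer, a least-squares (pseudoinverse) readout, and an independent binary decision per label. The function names, the sigmoid activation, n_hidden=200, and the 0.5 threshold are illustrative assumptions; this is the generic ELM recipe, not the paper's exact MLKDA+ELM pipeline.

    import numpy as np

    def elm_train(X, Y, n_hidden=200, seed=0):
        """Fit a single-hidden-layer ELM.

        X: (n_samples, n_features) inputs; Y: (n_samples, n_labels) binary targets.
        """
        rng = np.random.default_rng(seed)
        W = rng.standard_normal((X.shape[1], n_hidden))  # random, fixed input weights
        b = rng.standard_normal(n_hidden)                # random hidden biases
        H = 1.0 / (1.0 + np.exp(-(X @ W + b)))           # sigmoid hidden activations
        beta = np.linalg.pinv(H) @ Y                     # least-squares output weights
        return W, b, beta

    def elm_predict(X, W, b, beta, threshold=0.5):
        """Score all labels at once and threshold each label independently."""
        H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
        return (H @ beta >= threshold).astype(int)

    # Toy usage: 200 samples, 10 features, 3 correlated binary labels.
    rng = np.random.default_rng(1)
    X = rng.standard_normal((200, 10))
    Y = (X[:, :3] + 0.1 * rng.standard_normal((200, 3)) > 0).astype(int)
    W, b, beta = elm_train(X, Y)
    print(elm_predict(X[:5], W, b, beta))

Because only the output weights are solved (in closed form, via the pseudoinverse) while the hidden layer stays random, training costs a single least-squares fit, which is the efficiency property the abstract attributes to ELM.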
Pages: 502-513
Page count: 12