Multi-label dimensionality reduction and classification with extreme learning machines

Cited by: 1
Authors
Lin Feng [1 ,2 ]
Jing Wang [1 ,2 ]
Shenglan Liu [1 ,2 ]
Yao Xiao [1 ,2 ]
Affiliations
[1] Faculty of Electronic Information and Electrical Engineering, School of Computer Science and Technology, Dalian University of Technology
[2] School of Innovation Experiment, Dalian University of Technology
Funding
National Natural Science Foundation of China;
Keywords
multi-label; dimensionality reduction; kernel trick; classification;
DOI
None available
CLC number
TP181 [automated reasoning, machine learning];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Driven by real applications such as text categorization and image classification, multi-label learning has become a hot research topic in recent years, and much attention has been paid to multi-label classification algorithms. Because the high dimensionality of multi-label datasets can cause the curse of dimensionality and hamper the classification process, a dimensionality reduction algorithm, named multi-label kernel discriminant analysis (MLKDA), is proposed to reduce the dimensionality of multi-label datasets. Using the kernel trick, MLKDA processes the label set as a whole and realizes nonlinear dimensionality reduction with an idea similar to that of linear discriminant analysis (LDA). For classifying the reduced multi-label data, the extreme learning machine (ELM) is an efficient algorithm that also achieves good accuracy. Combined with ELM, MLKDA shows good performance in multi-label learning experiments on several datasets. Experiments on both static data and data streams show that MLKDA outperforms multi-label dimensionality reduction via dependence maximization (MDDM) and multi-label linear discriminant analysis (MLDA) on balanced datasets and datasets with stronger correlation between tags, and that ELM is also a good choice for multi-label classification.
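The classifier side of the pipeline, ELM, is a single-hidden-layer network in which the input weights and hidden biases are drawn at random and only the output weights are learned, via a single least-squares solve. A minimal sketch of that idea for multi-label data is below; the function names, the tanh activation, the hidden-layer size, and the 0.5 score threshold are illustrative assumptions, not the paper's exact setup:

```python
import numpy as np

def elm_train(X, Y, n_hidden=64, seed=0):
    """Train an ELM: random fixed hidden layer, least-squares output weights.

    X: (n_samples, n_features) feature matrix (e.g. after dimensionality reduction).
    Y: (n_samples, n_labels) binary label-indicator matrix.
    """
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))  # random input weights (never trained)
    b = rng.standard_normal(n_hidden)                # random hidden biases (never trained)
    H = np.tanh(X @ W + b)                           # hidden-layer output matrix
    beta = np.linalg.pinv(H) @ Y                     # output weights via Moore-Penrose pseudoinverse
    return W, b, beta

def elm_predict(X, W, b, beta, threshold=0.5):
    """Score each label, then threshold to obtain the predicted label set."""
    scores = np.tanh(X @ W + b) @ beta
    return (scores > threshold).astype(int)
```

In the paper's pipeline, `X` would be the MLKDA-reduced representation; because training reduces to one pseudoinverse computation, ELM is fast to train, which is the efficiency property the abstract refers to.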
Pages: 502-513
Page count: 12
Related papers
50 records in total
  • [1] Multi-label dimensionality reduction and classification with extreme learning machines
    Feng, Lin
    Wang, Jing
    Liu, Shenglan
    Xiao, Yao
    JOURNAL OF SYSTEMS ENGINEERING AND ELECTRONICS, 2014, 25 (03) : 502 - 513
  • [2] Multi-Label Classification Method Based on Extreme Learning Machines
    Venkatesan, Rajasekar
    Er, Meng Joo
    2014 13TH INTERNATIONAL CONFERENCE ON CONTROL AUTOMATION ROBOTICS & VISION (ICARCV), 2014, : 619 - 624
  • [3] Linear Dimensionality Reduction for Multi-label Classification
    Ji, Shuiwang
    Ye, Jieping
    21ST INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE (IJCAI-09), PROCEEDINGS, 2009, : 1077 - 1082
  • [4] A Review on Dimensionality Reduction for Multi-Label Classification
    Siblini, Wissam
    Kuntz, Pascale
    Meyer, Frank
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2021, 33 (03) : 839 - 857
  • [5] An Enhanced Dimensionality Reduction for Multi-label Learning
    Shao, Yanqing
    Yan, Kai
    2015 INTERNATIONAL CONFERENCE ON COMPUTERS, COMMUNICATIONS, AND SYSTEMS (ICCCS), 2015, : 163 - 170
  • [6] Multi-label classification using a cascade of stacked autoencoder and extreme learning machines
    Law, Anwesha
    Ghosh, Ashish
    NEUROCOMPUTING, 2019, 358 : 222 - 234
  • [7] Multi-Label Classification with Extreme Learning Machine
    Kongsorot, Yanika
    Horata, Punyaphol
    2014 6TH INTERNATIONAL CONFERENCE ON KNOWLEDGE AND SMART TECHNOLOGY (KST), 2014, : 81 - 86
  • [8] Extreme Learning Machine for Multi-Label Classification
    Sun, Xia
    Xu, Jingting
    Jiang, Changmeng
    Feng, Jun
    Chen, Su-Shing
    He, Feijuan
    ENTROPY, 2016, 18 (06)
  • [9] DiSMEC - Distributed Sparse Machines for Extreme Multi-label Classification
    Babbar, Rohit
    Schoelkopf, Bernhard
    WSDM'17: PROCEEDINGS OF THE TENTH ACM INTERNATIONAL CONFERENCE ON WEB SEARCH AND DATA MINING, 2017, : 721 - 729
  • [10] Deep Learning for Extreme Multi-label Text Classification
    Liu, Jingzhou
    Chang, Wei-Cheng
    Wu, Yuexin
    Yang, Yiming
    SIGIR'17: PROCEEDINGS OF THE 40TH INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL, 2017, : 115 - 124