An efficient class-dependent learning label approach using feature selection to improve multi-label classification algorithms

Cited by: 1
Authors
Zhao, Hao [1 ]
Li, Panpan [1 ]
Institutions
[1] Fujian Agr & Forestry Univ, Jinshan Coll, Fuzhou 350002, Fujian, Peoples R China
Keywords
Feature selection; Normalization unsupervised-learning; Advanced hybrid MLC; Reduction in dimensions;
DOI
10.1007/s10586-024-04756-1
CLC number
TP [Automation Technology, Computer Technology];
Subject classification code
0812 ;
Abstract
Recently, multi-label classification learning has emerged as a new area of study in machine learning, since it offers a multi-dimensional perspective on objects that carry several labels at once. Building a multi-label classification (MLC) system that is both fast and efficient requires accounting for the large-data environment. This paper introduces a new MLC model, RR-IPLST (Ridge Regression-Improved Principal Label Space Transformation), which combines ridge regression (RR) with an improved principal label space transformation (IPLST). The model incorporates Advanced Orthogonal Projection to Latent Structures (AOPLS) for optimal dimensionality reduction, increasing data-set relevance. IPLST uses Singular Value Decomposition (SVD) to identify label correlations, while RR mitigates multicollinearity. Unsupervised learning, in particular a value-learning method, refines the label predictions. The proposed model is rigorously evaluated on ten diverse multi-label datasets, and its outputs are compared against five commonly used classification techniques: support vector machines, multinomial naive Bayes, k-nearest neighbors, decision trees, and linear discriminant analysis. The simulation results show that the proposed solution outperforms similar methods in terms of accuracy, precision, convergence, error measurement, stability, and other tested parameters across the different datasets; the evaluation and test results indicate a 92.21% improvement over other existing algorithms.
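The core pipeline the abstract describes (compressing the label space with an SVD-based principal label space transformation, then regressing the compressed codes with ridge regression to handle multicollinearity) can be sketched roughly as follows. This is a minimal illustration of plain PLST plus closed-form ridge, not the paper's RR-IPLST: the function names, the 0.5 decoding threshold, and the direct normal-equation solve are all illustrative assumptions.

```python
import numpy as np

def plst_ridge_fit(X, Y, k, alpha=1.0):
    """Fit a PLST-style multi-label model (illustrative sketch).

    X: (n, d) feature matrix; Y: (n, L) binary label matrix.
    The SVD of the centered label matrix captures label correlations;
    the top-k right singular vectors define the compressed label space.
    """
    y_mean = Y.mean(axis=0)
    _, _, Vt = np.linalg.svd(Y - y_mean, full_matrices=False)
    V = Vt[:k].T                      # (L, k) projection onto label codes
    Z = (Y - y_mean) @ V              # (n, k) compressed label codes
    # Closed-form ridge regression: (X^T X + alpha I)^{-1} X^T Z.
    # The alpha * I term is what counters multicollinearity in X.
    d = X.shape[1]
    W = np.linalg.solve(X.T @ X + alpha * np.eye(d), X.T @ Z)
    return W, V, y_mean

def plst_ridge_predict(X, W, V, y_mean, threshold=0.5):
    """Regress codes, decode back to label space, threshold to {0, 1}."""
    Y_hat = X @ W @ V.T + y_mean
    return (Y_hat >= threshold).astype(int)
```

Training and decoding are both linear here; the paper's additions (AOPLS projection, unsupervised value-learning refinement) would slot in before the fit and after the decode, respectively.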
Pages: 25
Related Papers
50 records total
  • [41] A lightweight filter based feature selection approach for multi-label text classification
    Dhal P.
    Azad C.
    Journal of Ambient Intelligence and Humanized Computing, 2023, 14 (09) : 12345 - 12357
  • [42] Multi-label feature selection considering label supplementation
    Zhang, Ping
    Liu, Guixia
    Gao, Wanfu
    Song, Jiazhi
    PATTERN RECOGNITION, 2021, 120 (120)
  • [43] Independent Feature and Label Components for Multi-label Classification
    Zhong, Yongjian
    Xu, Chang
    Du, Bo
    Zhang, Lefei
    2018 IEEE INTERNATIONAL CONFERENCE ON DATA MINING (ICDM), 2018, : 827 - 836
  • [44] Multi-label feature selection via label relaxation
    Fan, Yuling
    Liu, Peizhong
    Liu, Jinghua
    APPLIED SOFT COMPUTING, 2025, 175
  • [45] A COPRAS-based Approach to Multi-Label Feature Selection for Text Classification
    Mohanrasu, S. S.
    Janani, K.
    Rakkiyappan, R.
    MATHEMATICS AND COMPUTERS IN SIMULATION, 2024, 222 : 3 - 23
  • [46] Feature Selection for Multi-label Learning Using Mutual Information and GA
    Yu, Ying
    Wang, Yinglong
    ROUGH SETS AND KNOWLEDGE TECHNOLOGY, RSKT 2014, 2014, 8818 : 454 - 463
  • [47] Multi-label feature selection with adaptive graph learning and label information enhancement
    Qin, Zhi
    Chen, Hongmei
    Mi, Yong
    Luo, Chuan
    Horng, Shi-Jinn
    Li, Tianrui
    KNOWLEDGE-BASED SYSTEMS, 2024, 285
  • [48] Robust Label and Feature Space Co-Learning for Multi-Label Classification
    Liu, Zhifeng
    Tang, Chuanjing
    Abhadiomhen, Stanley Ebhohimhen
    Shen, Xiang-Jun
    Li, Yangyang
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2023, 35 (11) : 11846 - 11859
  • [49] Dual Perspective of Label-Specific Feature Learning for Multi-Label Classification
    Hang, Jun-Yi
    Zhang, Min-Ling
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162, 2022,
  • [50] Dual Perspective of Label-Specific Feature Learning for Multi-Label Classification
    Hang, Jun-Yi
    Zhang, Min-Ling
    ACM Transactions on Knowledge Discovery from Data, 2024, 19 (01)