A Positive Region-based Dimensionality Reduction from High Dimensional data

Cited by: 0
Authors
Dai Zhe [1]
Liu Jianhui [1]
Affiliations
[1] Liaoning Tech Univ, Elect & Informat Sch, LNTU, Huludao, Peoples R China
Keywords
Dimensionality reduction; Attribute reduction; Classification accuracy; Rough set; ATTRIBUTE REDUCTION; DECISION SYSTEMS; ROUGH
DOI
Not available
Chinese Library Classification
R318 [Biomedical Engineering]
Discipline Code
0831
Abstract
Dimensionality reduction is an important attribute-processing task. In rough set theory, dimensionality reduction, i.e., attribute reduction, removes unnecessary attributes from a decision system. Many attribute reduction methods have been proposed to delete superfluous and irrelevant attributes from large-scale complete data sets. The main drawback of most attribute reduction algorithms is that they cannot discard redundant examples during dimensionality reduction, which degrades the computational efficiency of attribute reduction. To overcome this drawback, an improved attribute reduction algorithm for complete data sets is proposed, and the classification performance of the resulting reduct is also optimized. First, a compact decision system is constructed to remove repeated objects. Then, a significance measure is defined for candidate attributes, and a novel attribute reduction approach based on this measure is developed. To verify the efficiency of the proposed algorithm, experiments on UCI data sets are performed in comparison with other attribute reduction algorithms. The experimental results show that the proposed algorithm achieves promising improvements in selecting an attribute reduct.
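The abstract outlines a positive region-based reduction pipeline: compact the decision system by removing repeated objects, then grow a reduct attribute by attribute according to a significance measure until the positive region of the full attribute set is preserved. The sketch below illustrates that general rough-set technique in Python; the function names (compact_decision_system, positive_region, greedy_reduct), the exact-duplicate removal rule, and the positive-region-gain significance measure are illustrative assumptions, not the authors' published algorithm.

from collections import defaultdict

def compact_decision_system(rows, labels):
    # Drop exact duplicate (object, decision) pairs -- a simple stand-in for the
    # paper's "compact decision system" step (assumption: exact duplicates only).
    seen, new_rows, new_labels = set(), [], []
    for row, lab in zip(rows, labels):
        if (row, lab) not in seen:
            seen.add((row, lab))
            new_rows.append(row)
            new_labels.append(lab)
    return new_rows, new_labels

def positive_region(rows, labels, attrs):
    # Objects whose equivalence class (w.r.t. the chosen attributes) is
    # consistent, i.e. all members share the same decision label.
    blocks = defaultdict(list)
    for i, row in enumerate(rows):
        blocks[tuple(row[a] for a in attrs)].append(i)
    pos = set()
    for members in blocks.values():
        if len({labels[i] for i in members}) == 1:
            pos.update(members)
    return pos

def greedy_reduct(rows, labels, n_attrs):
    # Greedy forward selection: at each step add the attribute whose inclusion
    # enlarges the positive region the most (the assumed significance measure),
    # until the positive region of the full attribute set is reached.
    full_pos = positive_region(rows, labels, list(range(n_attrs)))
    reduct, current = [], set()
    while current != full_pos:
        best, best_gain = None, -1
        for a in range(n_attrs):
            if a in reduct:
                continue
            gain = len(positive_region(rows, labels, reduct + [a])) - len(current)
            if gain > best_gain:
                best, best_gain = a, gain
        reduct.append(best)
        current = positive_region(rows, labels, reduct)
    return reduct

# Toy usage: 4 objects, 3 condition attributes, binary decision.
rows = [(1, 0, 1), (1, 1, 1), (0, 1, 0), (0, 0, 0)]
labels = [1, 1, 0, 0]
rows, labels = compact_decision_system(rows, labels)
print(greedy_reduct(rows, labels, 3))  # -> [0]: attribute 0 alone separates the classes

The greedy loop recomputes the positive region for every candidate attribute at every step, so shrinking the object set first (the compaction step) reduces the cost of each computation, which is consistent with the efficiency motivation stated in the abstract.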
Pages: 624-628
Number of pages: 5