An improved random forest based on the classification accuracy and correlation measurement of decision trees

Cited by: 105
Authors
Sun, Zhigang [1 ,2 ,4 ,5 ]
Wang, Guotao [1 ,2 ,4 ,5 ]
Li, Pengfei [2 ,4 ,5 ]
Wang, Hui [3 ]
Zhang, Min [1 ]
Liang, Xiaowen [1 ]
Affiliations
[1] Heilongjiang Univ, Sch Elect & Elect Engn, Harbin 150080, Peoples R China
[2] Harbin Inst Technol, Sch Elect Engn & Automat, Harbin 150001, Peoples R China
[3] Yangzhou Univ, Sch Hydraul Sci & Engn, Yangzhou 225009, Peoples R China
[4] Key Lab Elect & Elect Reliabil Technol Heilongjian, Harbin 150001, Peoples R China
[5] MOE Key Lab Reliabil & Qual Consistency Elect Comp, Harbin 150001, Peoples R China
Keywords
Classification accuracy; Correlation measurement; Dot product; Random forest; CART; Distance
DOI
10.1016/j.eswa.2023.121549
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Random forest is one of the most widely used machine learning algorithms. The decision trees used to construct a random forest may have low classification accuracies or high correlations with one another, which degrades the overall performance of the forest. To address these problems, the authors proposed an improved random forest based on the classification accuracy and correlation measurement of decision trees. Its main idea includes two parts: retaining the classification and regression trees (CARTs) with better classification performance, and reducing the correlations between the CARTs. Specifically, in the classification-effect evaluation part, each CART was applied to make predictions on three reserved data sets and its average classification accuracy was obtained; all CARTs were then sorted in descending order of average classification accuracy. In the correlation measurement part, the improved dot product method was proposed to calculate the cosine similarity, i.e., the correlation, between CARTs in the feature space. Using the obtained average classification accuracy as a reference, the grid search method was used to find the inner product threshold. On this basis, for each CART pair whose inner product value exceeded the threshold, the CART with the lower average classification accuracy was marked as deletable. The obtained average classification accuracies and correlations of the CARTs were considered jointly: trees with high correlation and weak classification performance were deleted, and those of better quality were retained to construct the random forest. Multiple experiments show that the proposed improved random forest achieved higher average classification accuracy than the five random forests used for comparison, and its lead was stable. The G-means and out-of-bag data (OBD) scores obtained by the proposed improved random forest were also higher than those of the five comparison forests, with an even clearer lead. In addition, the results of three non-parametric tests show significant differences between the proposed improved random forest and the other five random forests, which effectively demonstrates the superiority and practicability of the proposed method.
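To make the described pipeline concrete, the following Python sketch illustrates one way the idea could be assembled: train candidate CARTs on bootstrap samples, score each on reserved data sets, measure pairwise correlation with a normalised dot product of their predictions, grid-search the inner product threshold, and drop the less accurate tree of any highly correlated pair. The helper names (`build_candidate_carts`, `prune_forest`, etc.), the use of prediction vectors as the feature-space representation, and the validation criterion for the grid search are illustrative assumptions rather than the authors' exact implementation.

```python
# Minimal sketch of the accuracy-and-correlation-based pruning idea from the
# abstract, not the authors' exact procedure. Assumptions: reserved data sets
# are plain (X, y) pairs, class labels are integer-encoded, and each tree's
# position in the feature space is approximated by its prediction vector on a
# reference sample X_ref.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.utils import resample


def build_candidate_carts(X_train, y_train, n_trees=100, random_state=0):
    """Train CARTs on bootstrap samples, as in a standard random forest."""
    rng = np.random.RandomState(random_state)
    carts = []
    for _ in range(n_trees):
        Xb, yb = resample(X_train, y_train, random_state=rng.randint(1 << 30))
        tree = DecisionTreeClassifier(max_features="sqrt",
                                      random_state=rng.randint(1 << 30))
        carts.append(tree.fit(Xb, yb))
    return carts


def average_accuracy(cart, reserved_sets):
    """Mean accuracy of one CART over several reserved (held-out) data sets."""
    return float(np.mean([cart.score(Xr, yr) for Xr, yr in reserved_sets]))


def cosine_similarity(u, v):
    """Normalised dot product of two prediction vectors (the 'correlation')."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12))


def ensemble_accuracy(carts, X, y):
    """Majority-vote accuracy of a set of CARTs (integer labels assumed)."""
    votes = np.array([c.predict(X) for c in carts]).astype(int)
    majority = np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, votes)
    return float(np.mean(majority == y))


def prune_forest(carts, reserved_sets, X_ref,
                 thresholds=np.linspace(0.5, 0.99, 50)):
    """Keep accurate, weakly correlated CARTs; drop redundant, weaker ones."""
    accs = np.array([average_accuracy(c, reserved_sets) for c in carts])
    preds = np.array([c.predict(X_ref) for c in carts], dtype=float)
    X_val, y_val = reserved_sets[0]            # validation set for the grid search

    best_keep, best_score = list(range(len(carts))), -np.inf
    for thr in thresholds:                     # grid search over inner product thresholds
        deletable = set()
        order = np.argsort(-accs)              # descending average accuracy
        for pos, i in enumerate(order):
            if i in deletable:
                continue
            for j in order[pos + 1:]:          # j is never more accurate than i
                if j in deletable:
                    continue
                if cosine_similarity(preds[i], preds[j]) > thr:
                    deletable.add(j)           # delete the less accurate tree of the pair
        keep = [k for k in range(len(carts)) if k not in deletable]
        score = ensemble_accuracy([carts[k] for k in keep], X_val, y_val)
        if score > best_score:
            best_keep, best_score = keep, score
    return [carts[k] for k in best_keep]
```

A typical call under these assumptions would be `forest = prune_forest(build_candidate_carts(X_train, y_train), reserved_sets, X_ref)`, after which majority voting over the retained trees gives the final prediction.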
Pages: 19