Feature selection for software effort estimation with localized neighborhood mutual information

Cited by: 10
Authors
Liu, Qin [1 ]
Xiao, Jiakai [2 ]
Zhu, Hongming [1 ]
Affiliations
[1] Tongji Univ, Sch Software Engn, Shanghai 201804, Peoples R China
[2] Tongji Univ, Dept Comp Sci & Technol, Shanghai 201804, Peoples R China
Keywords
Feature selection; Case-based reasoning; Neighborhood mutual information; Software effort estimation
DOI
10.1007/s10586-018-1884-x
CLC number
TP [Automation technology, computer technology]
Discipline code
0812
Abstract
Feature selection is usually employed before applying case-based reasoning (CBR) to Software Effort Estimation (SEE). Unfortunately, most feature selection methods treat CBR as a black box, so there is no guarantee that CBR is appropriate for the selected feature subset. The key to solving this problem is to measure how well the CBR assumption holds for a given feature set. In this paper, a measure called localized neighborhood mutual information (LNI) is proposed for this purpose, and a greedy method called LNI-based feature selection (LFS) is designed for feature selection. Experiments with leave-one-out cross validation (LOOCV) on six benchmark datasets demonstrate that: (1) CBR makes effective estimates with the LFS-selected subset compared with a randomized baseline method; and, compared with three representative feature selection methods, (2) LFS achieves the best MAR on 3 out of 6 datasets, with a 14% average improvement, and (3) LFS achieves the best MMRE on 5 out of 6 datasets, with a 24% average improvement.
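The abstract describes a greedy, LNI-guided forward search evaluated with LOOCV, MAR and MMRE. The LNI measure itself is not defined in this record, so the sketch below is only an illustration under stated assumptions: a generic greedy forward selection wrapped around a k-nearest-neighbour CBR (estimation-by-analogy) predictor, with MAR and MMRE computed as in the evaluation. The function and parameter names (greedy_lfs, cbr_predict, k) are hypothetical, and the LOOCV-error score stands in for the paper's LNI criterion.

```python
# Minimal sketch, assuming numpy only. score used inside greedy_lfs() is a
# hypothetical stand-in for the paper's LNI measure: it ranks a candidate
# feature subset by LOOCV error of a simple k-NN (CBR) effort predictor.
import numpy as np

def cbr_predict(X, y, query, k=3):
    """Estimate effort for `query` as the mean effort of its k nearest analogues."""
    d = np.linalg.norm(X - query, axis=1)      # Euclidean distance to past projects
    nearest = np.argsort(d)[:k]
    return y[nearest].mean()

def loocv_predictions(X, y, k=3):
    """Leave-one-out cross validation: predict each project from all the others."""
    preds = np.empty(len(y))
    for i in range(len(y)):
        mask = np.arange(len(y)) != i
        preds[i] = cbr_predict(X[mask], y[mask], X[i], k)
    return preds

def mar(actual, predicted):
    """Mean Absolute Residual."""
    return np.mean(np.abs(actual - predicted))

def mmre(actual, predicted):
    """Mean Magnitude of Relative Error (assumes actual efforts are positive)."""
    return np.mean(np.abs(actual - predicted) / actual)

def greedy_lfs(X, y, k=3):
    """Greedy forward selection; the paper scores subsets with LNI, which is
    approximated here (assumption) by LOOCV MMRE of the CBR predictor."""
    remaining = set(range(X.shape[1]))
    selected, best_err = [], np.inf
    while remaining:
        scored = [(mmre(y, loocv_predictions(X[:, selected + [f]], y, k)), f)
                  for f in remaining]
        err, f = min(scored)
        if err >= best_err:                    # stop when no remaining feature helps
            break
        selected.append(f)
        remaining.remove(f)
        best_err = err
    return selected, best_err
```

On a project-by-feature matrix `X` and effort vector `y`, `greedy_lfs(X, y)` returns the chosen feature indices and the corresponding LOOCV error; swapping the scoring line for a true LNI implementation would recover the LFS procedure the abstract names.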
Pages: S6953 - S6961
Number of pages: 9