An efficient method for feature selection in linear regression based on an extended Akaike's information criterion

Cited: 0
Authors
Vetrov, D. P. [1 ]
Kropotov, D. A. [2 ]
Ptashko, N. O. [1 ]
Affiliations
[1] Moscow MV Lomonosov State Univ, Fac Computat Math & Cybernet, Moscow 119992, Russia
[2] Russian Acad Sci, Dorodnicyn Comp Ctr, Moscow 119333, Russia
Funding
Russian Foundation for Basic Research;
Keywords
pattern recognition; linear regression; feature selection; Akaike's information criterion;
DOI
10.1134/S096554250911013X
Chinese Library Classification
O29 [Applied Mathematics];
Discipline Classification Code
070104;
Abstract
A method for feature selection in linear regression based on an extension of Akaike's information criterion is proposed. Using the classical Akaike information criterion (AIC) for feature selection requires an exhaustive search over all subsets of features, which is computationally prohibitive. A new information criterion is proposed that is a continuous extension of AIC; as a result, the feature selection problem is reduced to a smooth optimization problem, and an efficient procedure for solving it is derived. Experiments show that the proposed method selects features in linear regression efficiently. In the experiments, the proposed procedure is compared with the relevance vector machine, a feature selection method based on the Bayesian approach, and both procedures are shown to yield similar results. The main distinction of the proposed method is that certain regularization coefficients are identically zero, which makes it possible to avoid the underfitting effect characteristic of the relevance vector machine. A special case (the so-called nondiagonal regularization) is considered in which the two methods coincide.
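For context, the baseline that the abstract says is prohibitively expensive can be sketched directly: classical AIC-based feature selection fits an ordinary least-squares model on every subset of features and keeps the subset with the lowest AIC. The sketch below is illustrative only (it is not the paper's continuous extension), and all function names and the synthetic data are our own choices.

```python
# Illustrative sketch: exhaustive AIC-based feature selection for linear
# regression. This is the O(2^d) baseline the paper's continuous extension
# of AIC is designed to avoid; names and data here are hypothetical.
from itertools import combinations

import numpy as np


def aic_linear(X, y):
    """AIC of an OLS fit, up to an additive constant: n*log(RSS/n) + 2*k."""
    n, k = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = float(np.sum((y - X @ beta) ** 2))
    # Small epsilon guards against log(0) on a perfect fit.
    return n * np.log(rss / n + 1e-12) + 2 * k


def exhaustive_aic_selection(X, y):
    """Return the feature-index subset minimizing AIC (searches all 2^d - 1 subsets)."""
    d = X.shape[1]
    best_subset, best_score = (), np.inf
    for r in range(1, d + 1):
        for subset in combinations(range(d), r):
            score = aic_linear(X[:, subset], y)
            if score < best_score:
                best_subset, best_score = subset, score
    return best_subset


# Synthetic data: only features 0 and 2 carry signal.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))
y = 3.0 * X[:, 0] - 2.0 * X[:, 2] + 0.1 * rng.normal(size=200)
print(exhaustive_aic_selection(X, y))  # the informative features 0 and 2 should be selected
```

With d features the search fits 2^d - 1 models, which is exactly the cost the proposed smooth criterion sidesteps by making subset membership a continuous optimization variable.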
Pages: 1972-1985
Number of pages: 14
Related papers
50 records in total
  • [1] An efficient method for feature selection in linear regression based on an extended Akaike’s information criterion
    D. P. Vetrov
    D. A. Kropotov
    N. O. Ptashko
    Computational Mathematics and Mathematical Physics, 2009, 49 : 1972 - 1985
  • [2] The reliability of the Akaike information criterion method in cosmological model selection
    Tan, M. Y. J.
    Biswas, Rahul
    MONTHLY NOTICES OF THE ROYAL ASTRONOMICAL SOCIETY, 2012, 419 (04) : 3292 - 3303
  • [3] Marker selection by Akaike information criterion and Bayesian information criterion
    Li, WT
    Nyholt, DR
    GENETIC EPIDEMIOLOGY, 2001, 21 : S272 - S277
  • [4] Smoothing parameter selection in nonparametric regression using an improved Akaike information criterion
    Hurvich, CM
    Simonoff, JS
    Tsai, CL
    JOURNAL OF THE ROYAL STATISTICAL SOCIETY SERIES B-STATISTICAL METHODOLOGY, 1998, 60 : 271 - 293
  • [5] Minimization of Akaike's information criterion in linear regression analysis via mixed integer nonlinear program
    Kimura, Keiji
    Waki, Hayato
    OPTIMIZATION METHODS & SOFTWARE, 2018, 33 (03): : 633 - 649
  • [6] Uninformative Parameters and Model Selection Using Akaike's Information Criterion
    Arnold, Todd W.
    JOURNAL OF WILDLIFE MANAGEMENT, 2010, 74 (06): : 1175 - 1178
  • [7] Extending the Akaike information criterion to mixture regression models
    Naik, Prasad A.
    Shi, Peide
    Tsai, Chih-Ling
    JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION, 2007, 102 (477) : 244 - 254
  • [8] The Akaike information criterion in weighted regression of immittance data
    Ingdal, Mats
    Johnsen, Roy
    Harrington, David A.
    ELECTROCHIMICA ACTA, 2019, 317 : 648 - 653
  • [9] Efficient feature selection based on information gain criterion for face recognition
    Dhir, Chandra Shekhar
    Iqbal, Nadeem
    Lee, Soo-Young
    2007 INTERNATIONAL CONFERENCE ON INFORMATION ACQUISITION, VOLS 1 AND 2, 2007, : 524 - 528
  • [10] An optimization-based algorithm for model selection using an approximation of Akaike's Information Criterion
    Carvajal, Rodrigo
    Urrutia, Gabriel
    Aguero, Juan C.
    2016 AUSTRALIAN CONTROL CONFERENCE (AUCC), 2016, : 217 - 220