Max-Margin feature selection

Cited by: 5
Authors
Prasad, Yamuna [1 ]
Khandelwal, Dinesh [1 ]
Biswas, K. K. [1 ]
Affiliations
[1] Indian Inst Technol Delhi, Dept Comp Sci & Engn, New Delhi 110016, India
Keywords
Feature selection; One class SVM; Max-Margin; INFORMATION
DOI
10.1016/j.patrec.2017.04.011
CLC number
TP18 [Artificial intelligence theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Many machine learning applications, such as those in vision, biology and social networking, deal with data in high dimensions. Feature selection is typically employed to select a subset of features which improves generalization accuracy as well as reduces the computational cost of learning the model. One of the criteria used for feature selection is to jointly minimize the redundancy and maximize the relevance of the selected features. In this paper, we formulate the task of feature selection as a one-class SVM problem in a space where features correspond to the data points and instances correspond to the dimensions. The goal is to look for a representative subset of the features (support vectors) which describes the boundary of the region in which the set of features (data points) lies. This leads to a joint optimization of relevance and redundancy in a principled max-margin framework. Additionally, our formulation enables us to leverage existing techniques for optimizing the SVM objective, resulting in highly computationally efficient solutions for the task of feature selection. Specifically, we employ the dual coordinate descent algorithm (Hsieh et al., 2008), originally proposed for SVMs, for our formulation. We use a sparse representation to deal with data in very high dimensions. Experiments on seven publicly available benchmark datasets from a variety of domains show that our approach yields solutions that are orders of magnitude faster while retaining the same level of accuracy as state-of-the-art feature selection techniques. (C) 2017 Published by Elsevier B.V.
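The core idea in the abstract can be sketched with off-the-shelf tools. The snippet below is a conceptual sketch only, not the authors' dual coordinate descent solver or sparse representation: it treats each feature as a point in a space whose dimensions are the instances, fits a standard one-class SVM on that transposed matrix, and keeps the features that end up as support vectors. The use of scikit-learn's OneClassSVM and the value of the nu parameter are illustrative assumptions.

# Conceptual sketch (assumption, not the paper's solver): approximate max-margin
# feature selection by fitting a standard one-class SVM in the transposed space,
# where every feature of the original data becomes a point and every instance a
# dimension. The paper instead optimizes its objective with dual coordinate
# descent on a sparse representation; OneClassSVM here only illustrates the idea.
import numpy as np
from sklearn.svm import OneClassSVM

def max_margin_feature_selection(X, nu=0.1):
    """Return indices of selected features for a data matrix X (n_samples x n_features)."""
    F = X.T                                      # rows of F are features, described by their values over all instances
    ocsvm = OneClassSVM(kernel="linear", nu=nu)  # nu is an illustrative choice
    ocsvm.fit(F)
    return ocsvm.support_                        # features lying on the boundary of the feature "cloud"

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 500))              # 100 instances, 500 features
    selected = max_margin_feature_selection(X, nu=0.05)
    print(f"selected {len(selected)} of {X.shape[1]} features")

Under this reading, the support vectors are the representative features that jointly trade off relevance against redundancy; the generic solver above only approximates the behaviour the paper obtains with its dedicated formulation.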
Pages: 51-57
Page count: 7
Related papers
50 records in total
  • [1] Linear Cost-sensitive Max-margin Embedded Feature Selection for SVM
    Department of Business Administration, Emporia State University, Emporia, KS 66801, United States
    [institution unspecified], NY 13902, United States
    Expert Sys Appl, 2022
  • [2] Linear Cost-sensitive Max-margin Embedded Feature Selection for SVM
    Aram, Khalid Y.
    Lam, Sarah S.
    Khasawneh, Mohammad T.
    EXPERT SYSTEMS WITH APPLICATIONS, 2022, 197
  • [3] Max-Margin Token Selection in Attention Mechanism
    Tarzanagh, Davoud Ataee
    Li, Yingcong
    Zhang, Xuechen
    Oymak, Samet
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [4] Structural max-margin discriminant analysis for feature extraction
    Chen, Xiaobo
    Xiao, Yan
    Cai, Yinfeng
    Chen, Long
    KNOWLEDGE-BASED SYSTEMS, 2014, 70 : 154 - 166
  • [5] Nonlinear Feature Extraction with Max-Margin Data Shifting
    Wangni, Jianqiao
    Chen, Ning
    THIRTIETH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2016, : 2208 - 2214
  • [6] On the Consistency of Max-Margin Losses
    Nowak-Vila, Alex
    Rudi, Alessandro
    Bach, Francis
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 151, 2022, 151
  • [7] Max-Margin Contrastive Learning
    Shah, Anshul
    Sra, Suvrit
    Chellappa, Rama
    Cherian, Anoop
    THIRTY-SIXTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FOURTH CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE / TWELFTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2022, : 8220 - 8230
  • [8] Max-margin Markov networks
    Taskar, B
    Guestrin, C
    Koller, D
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 16, 2004, 16 : 25 - 32
  • [9] Cost-sensitive max-margin feature selection for SVM using alternated sorting method genetic algorithm
    Aram, Khalid Y.
    Lam, Sarah S.
    Khasawneh, Mohammad T.
    KNOWLEDGE-BASED SYSTEMS, 2023, 267
  • [10] Kernel-based Joint Feature Selection and Max-Margin Classification for Early Diagnosis of Parkinson’s Disease
    Adeli, Ehsan
    Wu, Guorong
    Saghafi, Behrouz
    An, Le
    Shi, Feng
    Shen, Dinggang
    SCIENTIFIC REPORTS, 7