sCOs: Semi-Supervised Co-Selection by a Similarity Preserving Approach

Cited by: 6
Authors
Benabdeslem, Khalid [1 ]
Mansouri, Dou El Kefel [2 ]
Makkhongkaew, Raywat [3 ]
Affiliations
[1] Univ Lyon1, LIRIS, CNRS, UMR5205, F-69622 Lyon, France
[2] Ibn Khaldoun Univ, BP P 78 Zaaroura, Tiaret 14000, Algeria
[3] State Railway Thailand SRT, Bangkok 10520, Thailand
Keywords
Feature extraction; Task analysis; Semisupervised learning; Data mining; Robustness; Optimization; Supervised learning; Instance selection; feature selection; semi-supervised learning; similarity preserving; optimization; co-selection; INSTANCE SELECTION; CLASSIFIERS;
DOI
10.1109/TKDE.2020.3014262
CLC classification
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
In this paper, we focus on the co-selection of instances and features in the semi-supervised learning scenario. In this context, co-selection becomes a more challenging problem, as the data contain labeled and unlabeled examples sampled from the same population. To carry out such semi-supervised co-selection, we propose a unified framework, called sCOs, which efficiently integrates the labeled and unlabeled parts into the co-selection process. The framework is based on introducing both a sparse regularization term and a similarity preserving approach. It evaluates the usefulness of features and instances in order to simultaneously select the most relevant ones. We propose two efficient algorithms that work for both convex and nonconvex functions. To the best of our knowledge, this paper offers the first study utilizing nonconvex penalties for co-selection in semi-supervised learning tasks. Experimental results on well-known benchmark datasets are provided to validate sCOs and compare it with representative methods from the state of the art.
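This record does not include the paper's formulas, so the following is only an illustrative sketch of the general idea the abstract describes: preserve a similarity structure while a row-sparse (l2,1) penalty drives feature selection, and reconstruction residuals score instances. The objective, solver, and all names (`co_select`, `alpha`, `k`) are assumptions for illustration, not the authors' actual sCOs algorithms.

```python
import numpy as np

def co_select(X, S, alpha=0.1, k=2, n_iter=30, eps=1e-8):
    """Illustrative co-selection sketch (NOT the sCOs algorithm).

    Regresses a spectral embedding of the similarity matrix S onto the
    data X under an l2,1 penalty:  min_W ||X W - Y||_F^2 + alpha ||W||_{2,1},
    solved by iteratively reweighted least squares. Row norms of W rank
    features; per-instance residuals rank instances.
    """
    # Target: top-k eigenvectors of the (symmetric) similarity matrix
    _, vecs = np.linalg.eigh(S)
    Y = vecs[:, -k:]

    n, d = X.shape
    D = np.eye(d)  # reweighting matrix for the l2,1 term
    for _ in range(n_iter):
        # Closed-form ridge-like update of the regularized problem
        W = np.linalg.solve(X.T @ X + alpha * D, X.T @ Y)
        row_norms = np.sqrt((W ** 2).sum(axis=1)) + eps
        D = np.diag(1.0 / (2.0 * row_norms))

    feature_scores = np.sqrt((W ** 2).sum(axis=1))        # larger = more relevant
    residuals = np.sqrt(((X @ W - Y) ** 2).sum(axis=1))
    instance_scores = -residuals                          # larger = better preserved
    return feature_scores, instance_scores
```

In this toy formulation, labeled examples could additionally constrain `Y` (e.g., by replacing embedding rows with class indicators), which is one common way semi-supervision enters similarity-preserving objectives.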
Pages: 2899-2911
Page count: 13
Related Papers
50 records in total
  • [1] Semi-supervised similarity preserving co-selection
    Makkhongkaew, Raywat
    Benabdeslem, Khalid
    [J]. 2016 IEEE 16TH INTERNATIONAL CONFERENCE ON DATA MINING WORKSHOPS (ICDMW), 2016, : 756 - 761
  • [2] Semi-supervised co-selection: features and instances by a weighting approach
    Makkhongkaew, Raywat
    Benabdeslem, Khalid
    Elghazel, Haytham
    [J]. 2016 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2016, : 3477 - 3484
  • [3] A semi-supervised approach for dimensionality reduction with distributional similarity
    Zheng, Feng
    Song, Zhan
    Shao, Ling
    Chung, Ronald
    Jia, Kui
    Wu, Xinyu
    [J]. NEUROCOMPUTING, 2013, 103 : 210 - 221
  • [4] Semi-Supervised Contrastive Learning With Similarity Co-Calibration
    Zhang, Yuhang
    Zhang, Xiaopeng
    Li, Jie
    Qiu, Robert C.
    Xu, Haohang
    Tian, Qi
    [J]. IEEE TRANSACTIONS ON MULTIMEDIA, 2023, 25 : 1749 - 1759
  • [5] Feature-based approach to semi-supervised similarity learning
    Gosselin, Philippe H.
    Cord, Matthieu
    [J]. PATTERN RECOGNITION, 2006, 39 (10) : 1839 - 1851
  • [6] Weighting Based Approach for Semi-supervised Feature Selection
    Benabdeslem, Khalid
    Hindawi, Mohammed
    Makkhongkaew, Raywat
    [J]. NEURAL INFORMATION PROCESSING, ICONIP 2015, PT IV, 2015, 9492 : 300 - 307
  • [7] Semi-supervised constraints preserving hashing
    Wang, Di
    Gao, Xinbo
    Wang, Xiumei
    [J]. NEUROCOMPUTING, 2015, 167 : 230 - 242
  • [8] Semi-supervised Neighborhood Preserving Discriminant Embedding: A Semi-supervised Subspace Learning Algorithm
    Mehdizadeh, Maryam
    MacNish, Cara
    Khan, R. Nazim
    Bennamoun, Mohammed
    [J]. COMPUTER VISION - ACCV 2010, PT III, 2011, 6494 : 199 - +
  • [9] Semi-supervised metric learning via topology preserving multiple semi-supervised assumptions
    Wang, Qianying
    Yuen, Pong C.
    Feng, Guocan
    [J]. PATTERN RECOGNITION, 2013, 46 (09) : 2576 - 2587
  • [10] Local preserving logistic I-Relief for semi-supervised feature selection
    Tang, Baige
    Zhang, Li
    [J]. NEUROCOMPUTING, 2020, 399 : 48 - 64