Supervised Self-taught Learning: Actively Transferring Knowledge from Unlabeled Data

Cited by: 0
Authors
Huang, Kaizhu [1 ]
Xu, Zenglin [2 ]
King, Irwin [2 ]
Lyu, Michael R. [2 ]
Campbell, Colin [1 ]
Affiliations
[1] Univ Bristol, Dept Engn Math, Bristol BS8 1TR, Avon, England
[2] Chinese Univ Hong Kong, Dept Comp Sci & Engn, Shatin, Peoples R China
Keywords
DOI
None available
CLC Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
We consider the task of Self-taught Learning (STL) from unlabeled data. In contrast to semi-supervised learning, which requires unlabeled data to share the same set of class labels as the labeled data, STL can transfer knowledge from different types of unlabeled data. STL uses a three-step strategy: (1) learning high-level representations from unlabeled data only, (2) reconstructing the labeled data via these representations, and (3) building a classifier over the reconstructed labeled data. However, the high-level representations, which are determined exclusively by the unlabeled data, may be inappropriate or even misleading for the subsequent classifier training. In this paper, we propose a novel Supervised Self-taught Learning (SSTL) framework that integrates the three isolated steps of STL into a single optimization problem. Benefiting from the interaction between the classifier optimization and the process of choosing high-level representations, the proposed model is able to select those discriminative representations that are more appropriate for classification. One important feature of our framework is that the final optimization can be solved iteratively with guaranteed convergence. We evaluate the framework on various data sets. The experimental results show that the proposed SSTL can outperform STL and traditional supervised learning methods in certain instances.
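The three-step STL pipeline in the abstract can be sketched as follows. This is a minimal illustration, not the paper's SSTL optimization: it uses scikit-learn's `DictionaryLearning` as an assumed stand-in for the representation learner, synthetic random data in place of real datasets, and logistic regression as the final classifier.

```python
# Hedged sketch of the three-step Self-taught Learning (STL) strategy,
# using scikit-learn components as illustrative stand-ins.
import numpy as np
from sklearn.decomposition import DictionaryLearning
from sklearn.linear_model import LogisticRegression

rng = np.random.RandomState(0)

# Unlabeled data: may come from classes unrelated to the labeled task.
X_unlabeled = rng.randn(100, 20)
# A small labeled set for the actual classification task.
X_labeled = rng.randn(30, 20)
y_labeled = rng.randint(0, 2, size=30)

# Step 1: learn high-level representations (a dictionary) from unlabeled data only.
dico = DictionaryLearning(n_components=10, max_iter=20, random_state=0)
dico.fit(X_unlabeled)

# Step 2: re-encode (reconstruct) the labeled data via the learned representations.
codes = dico.transform(X_labeled)

# Step 3: build a classifier over the re-encoded labeled data.
clf = LogisticRegression().fit(codes, y_labeled)
print(clf.score(codes, y_labeled))
```

Note that the dictionary in step 1 never sees the labels; SSTL, by contrast, couples steps 1-3 in one optimization so label information can influence which representations are chosen.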
Pages: 481 / +
Number of pages: 2
Related Papers
(50 total)
  • [1] SelfCCL: Curriculum Contrastive Learning by Transferring Self-Taught Knowledge for Fine-Tuning BERT
    Dehghan, Somaiyeh
    Amasyali, Mehmet Fatih
    APPLIED SCIENCES-BASEL, 2023, 13 (03):
  • [2] Deep Self-Taught Learning for Weakly Supervised Object Localization
    Jie, Zequn
    Wei, Yunchao
    Jin, Xiaojie
    Feng, Jiashi
    Liu, Wei
    30TH IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR 2017), 2017, : 4294 - 4302
  • [3] Self-Taught Active Learning from Crowds
    Fang, Meng
    Zhu, Xingquan
    Li, Bin
    Ding, Wei
    Wu, Xindong
    12TH IEEE INTERNATIONAL CONFERENCE ON DATA MINING (ICDM 2012), 2012, : 858 - 863
  • [4] Semi-Supervised Self-Taught Deep Learning for Finger Bones Segmentation
    Zhao, Ziyuan
    Zhang, Xiaoman
    Chen, Cen
    Li, Wei
    Peng, Songyou
    Wang, Jie
    Yang, Xulei
    Zhang, Le
    Zeng, Zeng
    2019 IEEE EMBS INTERNATIONAL CONFERENCE ON BIOMEDICAL & HEALTH INFORMATICS (BHI), 2019,
  • [5] Self-taught Recovery of Depth Data
    Yang, Pan
    Zhao, Haoran
    Qi, Lin
    Zhong, Guoqiang
    2015 ASIA-PACIFIC SIGNAL AND INFORMATION PROCESSING ASSOCIATION ANNUAL SUMMIT AND CONFERENCE (APSIPA), 2015, : 1270 - 1275
  • [6] Suppressing Memory and Knowledge: A Self-taught Pedagogy
    Jabbar, Abdul
    CHANGING ENGLISH-STUDIES IN CULTURE AND EDUCATION, 2012, 19 (03): : 307 - 312
  • [7] Self-Taught Anomaly Detection With Hybrid Unsupervised/Supervised Machine Learning in Optical Networks
    Chen, Xiaoliang
    Li, Baojia
    Proietti, Roberto
    Zhu, Zuqing
    Ben Yoo, S. J.
    JOURNAL OF LIGHTWAVE TECHNOLOGY, 2019, 37 (07) : 1742 - 1749
  • [8] Self-Taught Metric Learning without Labels
    Kim, Sungyeon
    Kim, Dongwon
    Cho, Minsu
    Kwak, Suha
    2022 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2022, : 7421 - 7431
  • [9] On the benefits of self-taught learning for brain decoding
    Germani, Elodie
    Fromont, Elisa
    Maumet, Camille
    GIGASCIENCE, 2023, 12