Self-Paced Unified Representation Learning for Hierarchical Multi-Label Classification

Cited by: 0
Authors
Yuan, Zixuan [1 ]
Liu, Hao [1 ]
Zhou, Haoyi [2 ]
Zhang, Denghui [3 ]
Zhang, Xiao [4 ]
Wang, Hao [5 ]
Xiong, Hui [1 ]
Affiliations
[1] Hong Kong Univ Sci & Technol Guangzhou, Hong Kong, Peoples R China
[2] Beihang Univ, Beijing, Peoples R China
[3] Stevens Inst Technol, Hoboken, NJ 07030 USA
[4] Shandong Univ, Jinan, Peoples R China
[5] Alibaba Grp, Hangzhou, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
ANNOTATION; ENSEMBLES;
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
Hierarchical Multi-Label Classification (HMLC) is a well-established problem that aims at assigning data instances to multiple classes stored in a hierarchical structure. Despite its importance, existing approaches often face two key limitations: (i) they employ dense networks to explore the class hierarchy solely as a hard criterion for maintaining taxonomic consistency among predicted classes, without leveraging the rich semantic relationships between instances and classes; (ii) they struggle to generalize in settings with deep class levels, since mini-batches sampled uniformly across levels ignore the varying complexities of the data and result in non-smooth model adaptation to sparse data. To mitigate these issues, we present a Self-Paced Unified Representation (SPUR) learning framework, which focuses on the interplay between instances and classes to flexibly organize the training process of HMLC algorithms. Our framework consists of two lightweight encoders designed to capture the semantics of input features and the topological information of the class hierarchy. These encoders generate unified embeddings of instances and the class hierarchy, which enable SPUR to exploit semantic dependencies between them and produce predictions consistent with taxonomic constraints. Furthermore, we introduce a dynamic hardness measurement strategy that considers both the class hierarchy and instance features to estimate the learning difficulty of each instance. This strategy incorporates the propagation loss obtained at each hierarchical level, allowing for a more comprehensive assessment of learning complexity. Extensive experiments on several empirical benchmarks demonstrate the effectiveness and efficiency of SPUR compared to state-of-the-art methods, especially in scenarios with missing features.
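The dual-encoder design described in the abstract can be pictured with a short sketch. Everything below (the module names, the one-hop parent aggregation, the dot-product scoring in a shared embedding space) is an assumption made for illustration from the abstract alone, not the paper's actual SPUR architecture:

```python
# Minimal sketch of the "two lightweight encoders + unified embeddings"
# idea from the abstract. All names and design choices here are assumed
# for illustration; the published SPUR model may differ substantially.
import torch
import torch.nn as nn

class InstanceEncoder(nn.Module):
    """Lightweight MLP that embeds raw instance features."""
    def __init__(self, in_dim: int, emb_dim: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, emb_dim), nn.ReLU(),
            nn.Linear(emb_dim, emb_dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

class HierarchyEncoder(nn.Module):
    """Embeds classes and mixes in each class's parent embedding so the
    class representation carries the topology of the hierarchy."""
    def __init__(self, num_classes: int, emb_dim: int):
        super().__init__()
        self.class_emb = nn.Embedding(num_classes, emb_dim)

    def forward(self, parent: torch.Tensor) -> torch.Tensor:
        # parent[c] is the index of class c's parent (the root maps to itself)
        emb = self.class_emb.weight            # (C, d)
        return emb + emb[parent]               # one hop of parent aggregation

def predict(x, parent, inst_enc, hier_enc):
    """Score every (instance, class) pair in the shared embedding space."""
    h_x = inst_enc(x)                          # (B, d)
    h_c = hier_enc(parent)                     # (C, d)
    return torch.sigmoid(h_x @ h_c.t())        # (B, C) class probabilities
```

Scoring instances against class embeddings in one shared space is what would let a single forward pass account for both instance semantics and hierarchy topology.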
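The self-paced component can be sketched in the same spirit. The per-level loss aggregation standing in for the abstract's propagation loss, and the hard-threshold pace schedule, are both assumptions rather than the paper's exact formulation:

```python
# Illustrative self-paced weighting: estimate per-instance difficulty from
# losses accumulated over hierarchy levels, then let only instances easier
# than the current pace threshold contribute to the gradient. The details
# (hard 0/1 weights, a growing pace) are assumptions, not SPUR's published
# hardness measure.
import torch
import torch.nn.functional as F

def level_losses(probs, targets, level_masks):
    """Per-instance BCE aggregated separately for each hierarchy level.

    probs, targets: (B, C); level_masks: list of boolean (C,) masks,
    one mask per level of the class hierarchy.
    """
    per_class = F.binary_cross_entropy(probs, targets, reduction="none")
    return torch.stack(
        [per_class[:, m].mean(dim=1) for m in level_masks], dim=1
    )                                           # (B, L)

def self_paced_weights(losses: torch.Tensor, pace: float) -> torch.Tensor:
    """Hard self-paced weights: an instance joins training once its total
    cross-level loss falls below the pace threshold."""
    difficulty = losses.sum(dim=1)              # (B,)
    return (difficulty < pace).float()          # 1 = easy enough, 0 = skip

# Typical usage inside a training step, growing `pace` every epoch so
# harder instances are admitted gradually:
#   l = level_losses(model(x), y, masks)        # (B, L)
#   w = self_paced_weights(l.detach(), pace)    # (B,)
#   loss = (w * l.sum(dim=1)).sum() / w.sum().clamp(min=1.0)
```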
Pages: 16623 - 16632
Number of pages: 10
Related Papers
50 records in total
  • [31] Extreme Learning Machine for Supervised Classification with Self-paced Learning
    Li, Li
    Zhao, Kaiyi
    Li, Sicong
    Sun, Ruizhi
    Cai, Saihua
    NEURAL PROCESSING LETTERS, 2020, 52 (03) : 1723 - 1744
  • [32] Self-paced ensemble learning for speech and audio classification
    Ristea, Nicolae-Catalin
    Ionescu, Radu Tudor
    INTERSPEECH 2021, 2021 : 2836 - 2840
  • [33] Learning Hierarchical Multi-label Classification Trees from Network Data
    Stojanova, Daniela
    Ceci, Michelangelo
    Malerba, Donato
    Dzeroski, Saso
    DISCOVERY SCIENCE, 2013, 8140 : 233 - 248
  • [34] Cognitive structure learning model for hierarchical multi-label text classification
    Wang, Boyan
    Hu, Xuegang
    Li, Peipei
    Yu, Philip S.
    KNOWLEDGE-BASED SYSTEMS, 2021, 218
  • [35] Applying semi-supervised learning in hierarchical multi-label classification
    Santos, Araken
    Canuto, Anne
    EXPERT SYSTEMS WITH APPLICATIONS, 2014, 41 (14) : 6075 - 6085
  • [36] Hierarchical Multi-label Classification using Fully Associative Ensemble Learning
    Zhang, L.
    Shah, S. K.
    Kakadiaris, I. A.
    PATTERN RECOGNITION, 2017, 70 : 89 - 103
  • [37] Cost-Effective Active Learning for Hierarchical Multi-Label Classification
    Yan, Yi-Fan
    Huang, Sheng-Jun
    PROCEEDINGS OF THE TWENTY-SEVENTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2018 : 2962 - 2968
  • [38] A Self-Paced Regularization Framework for Partial-Label Learning
    Lyu, Gengyu
    Feng, Songhe
    Wang, Tao
    Lang, Congyan
    IEEE TRANSACTIONS ON CYBERNETICS, 2022, 52 (02) : 899 - 911
  • [39] Extreme Learning Machine for Supervised Classification with Self-paced Learning
    Li, Li
    Zhao, Kaiyi
    Li, Sicong
    Sun, Ruizhi
    Cai, Saihua
    NEURAL PROCESSING LETTERS, 2020, 52 : 1723 - 1744
  • [40] Partial Label Learning via Self-Paced Curriculum Strategy
    Lyu, Gengyu
    Feng, Songhe
    Jin, Yi
    Li, Yidong
    MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES, ECML PKDD 2020, PT II, 2021, 12458 : 489 - 505