Self-Paced Unified Representation Learning for Hierarchical Multi-Label Classification

Cited by: 0
Authors
Yuan, Zixuan [1]
Liu, Hao [1]
Zhou, Haoyi [2]
Zhang, Denghui [3]
Zhang, Xiao [4]
Wang, Hao [5]
Xiong, Hui [1]
Affiliations
[1] Hong Kong Univ Sci & Technol Guangzhou, Guangzhou, Peoples R China
[2] Beihang Univ, Beijing, Peoples R China
[3] Stevens Inst Technol, Hoboken, NJ 07030 USA
[4] Shandong Univ, Jinan, Peoples R China
[5] Alibaba Grp, Hangzhou, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
ANNOTATION; ENSEMBLES;
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Hierarchical Multi-Label Classification (HMLC) is a well-established problem that aims at assigning data instances to multiple classes organized in a hierarchical structure. Despite its importance, existing approaches often face two key limitations: (i) they employ dense networks that exploit the class hierarchy solely as a hard criterion for maintaining taxonomic consistency among predicted classes, without leveraging the rich semantic relationships between instances and classes; (ii) they struggle to generalize in settings with deep class levels, since mini-batches sampled uniformly across levels ignore the varying complexities of the data and result in non-smooth model adaptation to sparse data. To mitigate these issues, we present a Self-Paced Unified Representation (SPUR) learning framework, which focuses on the interplay between instances and classes to flexibly organize the training process of HMLC algorithms. Our framework consists of two lightweight encoders designed to capture the semantics of the input features and the topological information of the class hierarchy. These encoders generate unified embeddings of instances and the class hierarchy, which enable SPUR to exploit semantic dependencies between them and produce predictions consistent with taxonomic constraints. Furthermore, we introduce a dynamic hardness measurement strategy that considers both the class hierarchy and instance features to estimate the learning difficulty of each instance. This strategy incorporates the propagation loss obtained at each hierarchical level, allowing for a more comprehensive assessment of learning complexity. Extensive experiments on several empirical benchmarks demonstrate the effectiveness and efficiency of SPUR compared to state-of-the-art methods, especially in scenarios with missing features.
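The two ideas in the abstract can be illustrated with a short sketch: separate encoders that map instance features and the class hierarchy into one embedding space, and a self-paced rule that scores each instance's difficulty by summing a per-level propagation loss and trains only on the currently easiest fraction. The code below is not the authors' released implementation; it is a minimal PyTorch-style sketch under assumed design choices (a one-hop adjacency propagation for the hierarchy encoder, binary cross-entropy as the per-level loss, and a hard-threshold pacing schedule), and names such as InstanceEncoder, HierarchyEncoder, per_level_bce, self_paced_weights, and train_step are hypothetical.

# Minimal sketch (assumptions noted above), not the SPUR reference code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class InstanceEncoder(nn.Module):
    """Maps raw instance features into the shared embedding space."""
    def __init__(self, in_dim, emb_dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, emb_dim), nn.ReLU(),
                                 nn.Linear(emb_dim, emb_dim))
    def forward(self, x):
        return self.net(x)

class HierarchyEncoder(nn.Module):
    """Embeds each class; a normalized parent-child adjacency mixes in topology."""
    def __init__(self, num_classes, emb_dim, adj):
        super().__init__()
        self.emb = nn.Embedding(num_classes, emb_dim)
        self.register_buffer("adj", adj)           # (C, C) normalized adjacency
    def forward(self):
        e = self.emb.weight
        return e + self.adj @ e                     # simple one-hop propagation

def per_level_bce(logits, targets, level_masks):
    """Per-level propagation loss: one (B,) loss vector per hierarchy level."""
    losses = []
    for mask in level_masks:                        # mask: (C,) bool for one level
        losses.append(F.binary_cross_entropy_with_logits(
            logits[:, mask], targets[:, mask], reduction="none").mean(dim=1))
    return losses

def self_paced_weights(level_losses, keep_ratio):
    """Hard-threshold pacing: keep only the easiest fraction of instances."""
    hardness = torch.stack(level_losses, dim=0).sum(dim=0)    # (B,)
    k = max(1, int(keep_ratio * hardness.numel()))
    threshold = hardness.sort().values[k - 1]
    return (hardness <= threshold).float()          # 1 = use this instance now

def train_step(inst_enc, hier_enc, x, y, level_masks, keep_ratio, opt):
    """One hypothetical training step: similarity scoring + self-paced weighting."""
    z = inst_enc(x)                                 # (B, D) instance embeddings
    c = hier_enc()                                  # (C, D) class embeddings
    logits = z @ c.t()                              # (B, C) instance-class scores
    level_losses = per_level_bce(logits, y, level_masks)
    w = self_paced_weights([l.detach() for l in level_losses], keep_ratio)
    loss = (w * sum(level_losses)).sum() / w.sum().clamp(min=1)
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

In use, keep_ratio would start small and be annealed toward 1.0 across epochs, so harder instances enter training only after the model has adapted to the easier ones.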
Pages: 16623-16632
Page count: 10
Related papers
50 records in total
  • [1] Self-Paced Multi-Label Learning with Diversity
    Seyedi, Seyed Amjad
    Ghodsi, S. Siamak
    Akhlaghian, Fardin
    Jalili, Mahdi
    Moradi, Parham
    ASIAN CONFERENCE ON MACHINE LEARNING, VOL 101, 2019, 101 : 790 - 805
  • [2] Self-paced multi-label co-training
    Gong, Yanlu
    Wu, Quanwang
    Zhou, Mengchu
    Wen, Junhao
    INFORMATION SCIENCES, 2023, 622 : 269 - 281
  • [3] Unsupervised person re-identification with multi-label learning guided self-paced clustering
    Li, Qing
    Peng, Xiaojiang
    Qiao, Yu
    Hao, Qi
    PATTERN RECOGNITION, 2022, 125
  • [4] Active learning for hierarchical multi-label classification
    Nakano, Felipe Kenji
    Cerri, Ricardo
    Vens, Celine
    DATA MINING AND KNOWLEDGE DISCOVERY, 2020, 34 (05) : 1496 - 1530
  • [5] Multi-label classification via learning a unified object-label graph with sparse representation
    Yao, Lina
    Sheng, Quan Z.
    Ngu, Anne H. H.
    Gao, Byron J.
    Li, Xue
    Wang, Sen
    WORLD WIDE WEB-INTERNET AND WEB INFORMATION SYSTEMS, 2016, 19 (06) : 1125 - 1149
  • [6] Supervised representation learning for multi-label classification
    Huang, Ming
    Zhuang, Fuzhen
    Zhang, Xiao
    Ao, Xiang
    Niu, Zhengyu
    Zhang, Min-Ling
    He, Qing
    MACHINE LEARNING, 2019, 108 (05) : 747 - 763
  • [7] Multi-Label Adversarial Attack With New Measures and Self-Paced Constraint Weighting
    Su, Fengguang
    Wu, Ou
    Zhu, Weiyao
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2024, 33 : 3809 - 3822