Adaptive Scheme of Clustering-Based Unsupervised Learning for Person Re-identification

Cited by: 0
Authors
Anh-Vu Vo Duy [2 ,3 ]
Quang-Huy Che [1 ,2 ,3 ]
Vinh-Tiep Nguyen [1 ,2 ,3 ]
Affiliations
[1] Multimedia Communications Laboratory, Ho Chi Minh City, Vietnam
[2] University of Information Technology, Ho Chi Minh City, Vietnam
[3] Vietnam National University, Ho Chi Minh City, Vietnam
Keywords
Person re-identification; Unsupervised learning; Adaptive scheme
DOI
10.1007/978-981-97-4985-0_16
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Clustering and cluster-level contrastive learning are widely used components of unsupervised re-identification methods, which aim to extract robust and distinctive features from data without annotations. However, existing approaches often overlook how the hyperparameters of the re-ID modules correlate with the considerable shrinkage in cluster density. This issue can lead to misaligned cluster representation vectors being used when computing the cluster-level contrastive loss, which in turn may hinder the model's performance. To address this problem, we propose a novel method called Adaptive Scheme of Clustering-based Unsupervised Learning (ASCUL). In contrast to approaches that rely on predefined clustering hyperparameters, we incorporate a regulator that performs adaptive adjustments to maximize the number of informative samples during training. Furthermore, our scheme for mining cluster representations adapts dynamically to substantial changes in intra-class variation, making efficient use of the cluster-wise loss. Experiments on two benchmark datasets consistently show that our approach outperforms state-of-the-art unsupervised person re-ID methods.
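The abstract does not spell out the regulator, so the following is only a minimal sketch of the general mechanics of clustering-based unsupervised re-ID that ASCUL builds on. It assumes a DBSCAN clustering stage and an InfoNCE-style cluster-level contrastive loss, as in prior cluster-contrast pipelines; the adapt_eps regulator, all function names, and all hyperparameter values are hypothetical illustrations and are not taken from the paper.

    import torch
    import torch.nn.functional as F
    from sklearn.cluster import DBSCAN

    def adapt_eps(eps, noise_ratio, target=0.2, step=0.02):
        """Hypothetical regulator: nudge the DBSCAN radius so the share of
        un-clustered ("noise") samples stays near a target fraction."""
        if noise_ratio > target:           # too many outliers -> loosen clustering
            return eps + step
        return max(eps - step, step)       # clusters dense enough -> tighten slightly

    def cluster_contrastive_loss(feats, targets, centroids, temperature=0.05):
        """InfoNCE-style loss pulling each feature toward its cluster centroid."""
        logits = feats @ centroids.t() / temperature   # (batch, num_clusters)
        return F.cross_entropy(logits, targets)

    def train_epoch(encoder, optimizer, loader, all_feats, eps):
        """One simplified epoch: cluster, adapt eps, train with the cluster-wise loss.

        all_feats: (N, d) L2-normalized numpy features extracted before the epoch.
        loader:    yields (images, sample_indices) so cluster labels can be looked up.
        """
        labels = DBSCAN(eps=eps, min_samples=4, metric="cosine").fit_predict(all_feats)
        noise_ratio = float((labels == -1).mean())
        eps = adapt_eps(eps, noise_ratio)              # adaptive adjustment (assumed)
        if labels.max() < 0:                           # no clusters found this epoch
            return eps

        valid = labels >= 0
        centroids = torch.stack([torch.from_numpy(all_feats[labels == c].mean(0))
                                 for c in range(labels.max() + 1)])
        centroids = F.normalize(centroids.float(), dim=1)

        for imgs, idx in loader:
            idx = idx.numpy()
            mask = valid[idx]                          # drop un-clustered samples
            if not mask.any():
                continue
            feats = F.normalize(encoder(imgs[torch.from_numpy(mask)]), dim=1)
            targets = torch.as_tensor(labels[idx][mask], dtype=torch.long)
            loss = cluster_contrastive_loss(feats, targets, centroids)
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
        return eps

In this sketch, raising eps when too many samples fall out as DBSCAN noise keeps more of them inside clusters and therefore inside the contrastive loss, which is the spirit of "maximizing the number of informative samples" described in the abstract; the paper's actual regulator and cluster-representation mining scheme may differ.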
Pages: 193-205 (13 pages)