Pseudo-positive regularization for deep person re-identification

Cited by: 0
Authors
Fuqing Zhu
Xiangwei Kong
Haiyan Fu
Qi Tian
Affiliations
[1] Dalian University of Technology, School of Information and Communication Engineering
[2] University of Texas at San Antonio, Department of Computer Science
Source
Multimedia Systems, 2018, Vol. 24
Keywords
Convolutional neural network; Pseudo-Positive Regularization; Person re-identification
DOI
Not available
Chinese Library Classification
Subject Classification
Abstract
An intrinsic challenge of person re-identification (re-ID) is the difficulty of annotation, which typically means (1) few training samples per identity and, consequently, (2) little diversity among the training samples. As a result, there is a high risk of over-fitting when training a convolutional neural network (CNN), a state-of-the-art approach to person re-ID. To reduce this risk, this paper proposes a Pseudo-Positive Regularization method that enriches the diversity of the training data. Specifically, unlabeled data from an independent pedestrian database are retrieved using the target training data as queries. A small proportion of these retrieved samples are randomly selected as Pseudo-Positive samples and added to the target training set for supervised CNN training. Adding Pseudo-Positive samples is therefore a data-augmentation strategy that reduces the risk of over-fitting during CNN training. We implement our idea in identification CNN models (i.e., CaffeNet, VGGNet-16, and ResNet-50). On the CUHK03 and Market-1501 datasets, experimental results demonstrate that the proposed method consistently improves the baseline and yields performance competitive with state-of-the-art person re-ID methods.
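As a rough illustration of the retrieval-and-selection step described in the abstract, the sketch below retrieves, for every labelled training image, its nearest neighbours in an unlabelled pedestrian pool and randomly keeps a small fraction of them as pseudo-positive samples. The use of cosine similarity over CNN embeddings, the `top_k` and `ratio` values, and the rule that each kept sample inherits its query's identity label are illustrative assumptions, not the paper's reported settings.

```python
import numpy as np


def select_pseudo_positives(train_feats, train_labels, pool_feats,
                            top_k=10, ratio=0.1, seed=0):
    """Illustrative sketch: pick pseudo-positive samples from an unlabelled pool.

    For every labelled training image, the unlabelled pool is ranked by
    cosine similarity and a small random fraction of the top-ranked samples
    is kept; each kept sample inherits the query's identity label
    (the 'pseudo-positive' assumption made here).
    """
    rng = np.random.default_rng(seed)
    # L2-normalise so the dot product equals cosine similarity.
    t = train_feats / np.linalg.norm(train_feats, axis=1, keepdims=True)
    p = pool_feats / np.linalg.norm(pool_feats, axis=1, keepdims=True)
    sim = t @ p.T                                  # shape: (n_train, n_pool)

    n_keep = max(1, int(round(ratio * top_k)))     # samples kept per query
    pseudo_idx, pseudo_labels = [], []
    for i, label in enumerate(train_labels):
        ranked = np.argsort(-sim[i])[:top_k]       # top_k nearest pool samples
        kept = rng.choice(ranked, size=n_keep, replace=False)
        pseudo_idx.extend(kept.tolist())
        pseudo_labels.extend([label] * n_keep)
    return np.asarray(pseudo_idx), np.asarray(pseudo_labels)


if __name__ == "__main__":
    rng = np.random.default_rng(42)
    train_feats = rng.normal(size=(20, 128))    # stand-in CNN embeddings (labelled set)
    train_labels = rng.integers(0, 5, size=20)  # identity labels for the labelled set
    pool_feats = rng.normal(size=(500, 128))    # stand-in embeddings of the unlabelled pool
    idx, labels = select_pseudo_positives(train_feats, train_labels, pool_feats)
    # The selected pool images would be appended to the training set with these labels.
    print(idx[:5], labels[:5])
```

In the actual pipeline described by the abstract, the embeddings would come from the identification CNN (CaffeNet, VGGNet-16, or ResNet-50) and the augmented training set would then be used for supervised CNN training.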
Pages: 477-489
Number of pages: 12
Related papers
50 records in total
  • [21] Angular regularization for unsupervised domain adaption on person re-identification
    Zhang, Wenfeng
    Huang, Lei
    Wei, Zhiqiang
    Qin, Qibing
    Lv, Lei
    Neural Computing and Applications, 2021, 33 (24) : 17041 - 17056
  • [22] Discriminative feature mining with relation regularization for person re-identification
    Yang, Jing
    Zhang, Canlong
    Li, Zhixin
    Tang, Yanping
    Wang, Zhiwen
    INFORMATION PROCESSING & MANAGEMENT, 2023, 60 (03)
  • [24] Deep-Person: Learning discriminative deep features for person Re-Identification
    Bai, Xiang
    Yang, Mingkun
    Huang, Tengteng
    Dou, Zhiyong
    Yu, Rui
    Xu, Yongchao
    PATTERN RECOGNITION, 2020, 98
  • [25] Lifelong Person Re-identification by Pseudo Task Knowledge Preservation
    Ge, Wenhang
    Du, Junlong
    Wu, Ancong
    Xian, Yuqiao
    Yan, Ke
    Huang, Feiyue
    Zheng, Wei-Shi
    THIRTY-SIXTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FOURTH CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE / THE TWELFTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2022, : 688 - 696
  • [26] Adaptation and Re-Identification Network: An Unsupervised Deep Transfer Learning Approach to Person Re-Identification
    Li, Yu-Jhe
    Yang, Fu-En
    Liu, Yen-Cheng
    Yeh, Yu-Ying
    Du, Xiaofei
    Wang, Yu-Chiang Frank
    PROCEEDINGS 2018 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION WORKSHOPS (CVPRW), 2018, : 285 - 291
  • [27] Person Re-identification
    Bak, Slawomir
    Bremond, Francois
    ERCIM NEWS, 2013, (95): : 33 - 34
  • [28] Multilevel deep representation fusion for person re-identification
    Zhao, Yu
    Fu, Keren
    Shu, Qiaoyuan
    Wei, Pengcheng
    Shi, Xi
    JOURNAL OF ELECTRONIC IMAGING, 2020, 29 (02)
  • [29] A survey of person re-identification based on deep learning
    Li Q.
    Hu W.-Y.
    Li J.-Y.
    Liu Y.
    Li M.-X.
    Gongcheng Kexue Xuebao/Chinese Journal of Engineering, 2022, 44 (05): : 920 - 932
  • [30] Person Re-Identification Research via Deep Learning
    Lu Jian
    Chen Xu
    Luo Maoxin
    Wang Hangying
    LASER & OPTOELECTRONICS PROGRESS, 2020, 57 (16)