Extracting human attributes using a convolutional neural network approach

Cited by: 34
Authors
Perlin, Hugo Alberto [1 ]
Lopes, Heitor Silverio [2 ]
Affiliations
[1] Federal Institute of Parana (IFPR), Paranagua, PR, Brazil
[2] Federal University of Technology - Parana (UTFPR), Curitiba, PR, Brazil
Keywords
Computer vision; Machine learning; Soft-biometrics; Convolutional neural network; Gender recognition; Clothes parsing; Classification; Features; Scale
DOI
10.1016/j.patrec.2015.07.012
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Extracting high-level information from digital images and videos is a hard problem frequently faced by the computer vision and machine learning communities. Modern surveillance systems can monitor people, cars, or objects by using computer vision methods. The objective of this work is to propose a method for identifying soft biometrics, in the form of clothing and gender, from images containing people, as a preliminary step towards identifying the people themselves. We propose a solution to this classification problem using a Convolutional Neural Network (CNN) that works as an all-in-one feature extractor and classifier. This method allows the development of a high-level, end-to-end clothing/gender classifier. Experiments were done comparing the CNN with hand-designed classifiers. Also, two different operating modes of the CNN are proposed and compared with each other. The results obtained were very promising, showing that it is possible to extract soft-biometric attributes using an end-to-end CNN classifier. The proposed method achieved good generalization capability, classifying the three different attributes with good accuracy. This suggests the possibility of searching images using soft biometrics as search terms. (C) 2015 Elsevier B.V. All rights reserved.
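To illustrate the "all-in-one feature extractor and classifier" idea described in the abstract, the sketch below shows a minimal end-to-end CNN in PyTorch that maps a cropped person image directly to per-attribute scores. This is a hypothetical example, not the authors' network: the 64x64 input size, the layer widths, and the three binary attribute heads (e.g. gender plus two clothing attributes) are assumptions made only for illustration.

import torch
import torch.nn as nn

class SoftBiometricCNN(nn.Module):
    # Hypothetical end-to-end CNN: learned features plus attribute classifier.
    def __init__(self, num_attributes: int = 3):
        super().__init__()
        # Convolutional feature extractor, trained jointly with the classifier
        # (no hand-designed features).
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(64, 128, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        # Fully connected head producing one logit per binary attribute.
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, 256), nn.ReLU(),  # assumes 64x64 RGB input
            nn.Linear(256, num_attributes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

# Usage: a batch of 64x64 RGB person crops -> independent per-attribute probabilities.
model = SoftBiometricCNN()
logits = model(torch.randn(4, 3, 64, 64))
probs = torch.sigmoid(logits)  # multi-label output, one probability per attribute
loss = nn.BCEWithLogitsLoss()(logits, torch.randint(0, 2, (4, 3)).float())

Trained this way, feature extraction and attribute classification are optimized jointly, which is what distinguishes the end-to-end CNN from the hand-designed classifiers it is compared against in the paper.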
Pages: 250-259
Number of pages: 10
Related papers
50 items in total
  • [1] A novel approach for human skin detection using convolutional neural network
    Khawla Ben Salah
    Mohamed Othmani
    Monji Kherallah
    The Visual Computer, 2022, 38: 1833-1843
  • [2] A novel approach for human skin detection using convolutional neural network
    Ben Salah, Khawla
    Othmani, Mohamed
    Kherallah, Monji
    VISUAL COMPUTER, 2022, 38 (05): 1833-1843
  • [3] Extracting atmospheric turbulence phase using deep convolutional neural network
    Xu Qi-Wei
    Wang Pei-Pei
    Zeng Zhen-Jia
    Huang Ze-Bin
    Zhou Xin-Xing
    Liu Jun-Min
    Li Ying
    Chen Shu-Qing
    Fan Dian-Yuan
    ACTA PHYSICA SINICA, 2020, 69 (01)
  • [4] Relative Attributes with Deep Convolutional Neural Network
    Kim, Dong-Jin
    Yoo, Donggeun
    Im, Sunghoon
    Kim, Namil
    Sirinukulwattana, Tharatch
    Kweon, In So
    2015 12TH INTERNATIONAL CONFERENCE ON UBIQUITOUS ROBOTS AND AMBIENT INTELLIGENCE (URAI), 2015: 157-158
  • [5] An Approach for Biometric Verification Based on Human Body Communication using Convolutional Neural Network
    Li, Jingzhen
    Liu, Yuhang
    Igbe, Tobore
    Nie, Zedong
    2019 IEEE 9TH INTERNATIONAL CONFERENCE ON CONSUMER ELECTRONICS (ICCE-BERLIN), 2019: 12-15
  • [6] Extracting Wetland Type Information with a Deep Convolutional Neural Network
    Guan, XianMing
    Wang, Di
    Wan, Luhe
    Zhang, Jiyi
    COMPUTATIONAL INTELLIGENCE AND NEUROSCIENCE, 2022, 2022
  • [7] Extracting Lamb wave vibrating modes with convolutional neural network
    He, Juxing
    Tian, Yahui
    Li, Honglang
    Lu, Zixiao
    Yang, Guiting
    Lan, Jianyu
    JOURNAL OF THE ACOUSTICAL SOCIETY OF AMERICA, 2022, 151 (04): 2290-2296
  • [8] Tomato disease recognition using a convolutional neural network approach
    Hang, Xiao
    Gao, Hongju
    International Agricultural Engineering Journal, 2019, 28 (03): 241-248
  • [9] A Novel Approach for Sentiment Classification by Using Convolutional Neural Network
    Kalaivani, M. S.
    Jayalakshmi, S.
    PROCEEDINGS OF SECOND INTERNATIONAL CONFERENCE ON SUSTAINABLE EXPERT SYSTEMS (ICSES 2021), 2022, 351: 143-152
  • [10] Human action interpretation using convolutional neural network: a survey
    Zainab Malik
    Mohd Ibrahim Bin Shapiai
    Machine Vision and Applications, 2022, 33