Extracting human attributes using a convolutional neural network approach

Cited by: 34
Authors
Perlin, Hugo Alberto [1 ]
Lopes, Heitor Silverio [2 ]
Affiliations
[1] Parana Fed Inst Parana, Paranagua, PR, Brazil
[2] Univ Tecnol Fed Parana, Curitiba, Parana, Brazil
Keywords
Computer vision; Machine learning; Soft-biometrics; Convolutional Neural Network; Gender recognition; Clothes parsing; CLASSIFICATION; FEATURES; SCALE;
DOI
10.1016/j.patrec.2015.07.012
CLC classification number
TP18 [Theory of artificial intelligence];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Extracting high-level information from digital images and videos is a hard problem frequently faced by the computer vision and machine learning communities. Modern surveillance systems can monitor people, cars, or objects using computer vision methods. The objective of this work is to propose a method for identifying soft biometrics, in the form of clothing and gender, from images containing people, as a preliminary step toward identifying the people themselves. We propose a solution to this classification problem using a Convolutional Neural Network (CNN), working as an all-in-one feature extractor and classifier. This method allows the development of a high-level, end-to-end clothing/gender classifier. Experiments were done comparing the CNN with hand-designed classifiers. Also, two different operating modes of the CNN are proposed and compared with each other. The results obtained were very promising, showing that it is possible to extract soft-biometric attributes using an end-to-end CNN classifier. The proposed method achieved good generalization capability, classifying the three different attributes with good accuracy. This suggests the possibility of searching images using soft biometrics as search terms. (C) 2015 Elsevier B.V. All rights reserved.
Pages: 250-259
Page count: 10
Related papers
50 records in total
  • [21] Anti-spoofing Approach Using Deep Convolutional Neural Network
    Chatterjee, Prosenjit
    Roy, Kaushik
    RECENT TRENDS AND FUTURE TECHNOLOGY IN APPLIED INTELLIGENCE, IEA/AIE 2018, 2018, 10868 : 745 - 750
  • [22] A Foreground Extraction Approach Using Convolutional Neural Network with Graph Cut
    Utah, Matee
    Iltaf, Adnan
    Hou, Qiujun
    Ali, Farman
    Liu, Chuancai
    2018 IEEE 3RD INTERNATIONAL CONFERENCE ON IMAGE, VISION AND COMPUTING (ICIVC), 2018, : 40 - 44
  • [23] A New Approach to Classify Drones Using a Deep Convolutional Neural Network
    Rakshit, Hrishi
    Zadeh, Pooneh Bagheri
    DRONES, 2024, 8 (07)
  • [24] A novel bearing fault detection approach using a convolutional neural network
    Aydin, Tolga
    Erdem, Ebru
    Erkayman, Burak
    Kocadagistan, Mustafa Engin
    Teker, Tanju
    MATERIALS TESTING, 2024, 66 (04) : 478 - 492
  • [25] A deep convolutional neural network approach using medical image classification
    Mousavi, Mohammad
    Hosseini, Soodeh
    BMC MEDICAL INFORMATICS AND DECISION MAKING, 2024, 24 (01)
  • [26] A Robust RF Fingerprinting Approach Using Multisampling Convolutional Neural Network
    Yu, Jiabao
    Hu, Aiqun
    Li, Guyue
    Peng, Linning
    IEEE INTERNET OF THINGS JOURNAL, 2019, 6 (04): : 6786 - 6799
  • [27] A Modern Approach for Sign Language Interpretation Using Convolutional Neural Network
    Paul, Pias
    Bhuiya, Moh. Anwar-Ul-Azim
    Ullah, Md. Ayat
    Saqib, Molla Nazmus
    Mohammed, Nabeel
    Momen, Sifat
    PRICAI 2019: TRENDS IN ARTIFICIAL INTELLIGENCE, PT III, 2019, 11672 : 431 - 444
  • [28] A Novel Approach to Identify the Brain Tumour Using Convolutional Neural Network
    Khari S.
    Gupta D.
    Chaudhary A.
    Bhatla R.
    EAI Endorsed Transactions on Pervasive Health and Technology, 2023, 9 (01)
  • [29] Efficient Approach for Rhopalocera Classification Using Growing Convolutional Neural Network
    Kaur, Iqbaldeep
    Goyal, Lalit Mohan
    Ghansiyal, Adrija
    Hemanth, D. Jude
    INTERNATIONAL JOURNAL OF UNCERTAINTY FUZZINESS AND KNOWLEDGE-BASED SYSTEMS, 2022, 30 (03) : 499 - 512
  • [30] Extracting Fresnel zones from migrated dip-angle gathers using a convolutional neural network
    Cheng, Qian
    Zhang, Jianfeng
    Liu, Wei
    EXPLORATION GEOPHYSICS, 2021, 52 (02) : 211 - 220