Effective Facial Obstructions Removal with Enhanced Cycle-Consistent Generative Adversarial Networks

Cited by: 2
Authors
Wang, Yuming [1 ,2 ]
Ou, Xiao [1 ]
Tu, Lai [1 ]
Liu, Ling [2 ]
Affiliations
[1] Huazhong Univ Sci & Technol, Sch Elect Informat & Commun, Wuhan 430074, Hubei, Peoples R China
[2] Georgia Inst Technol, Sch Comp Sci, Coll Comp, Atlanta, GA 30332 USA
Funding
National Science Foundation (USA);
Keywords
Face recognition; Obstructions removal; Convolutional Neural Networks; Generative Adversarial Networks;
DOI
10.1007/978-3-319-94361-9_16
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Face recognition has become an important and popular authentication technology for web services and mobile applications in recent years. The quality of facial obstruction removal is a critical component of face recognition, especially for mission-critical applications such as facial-recognition-based authentication systems. It is well known that facial obstructions can severely degrade the extraction and recognition quality of facial features, which in turn hurts the prediction accuracy of face recognition models and algorithms. In this paper, we propose a Facial Obstructions Removal Scheme (FORS) based on an Enhanced Cycle-Consistent Generative Adversarial Network (ECGAN) for face recognition. By training a convolutional-neural-network-based facial image classifier, we identify images that contain facial obstructions. The images with facial obstructions are then processed by the facial image converter of FORS and the ECGAN model, which removes the obstructions seamlessly while preserving the facial features. Our experimental results show that the proposed FORS scheme improves face recognition accuracy over some existing state-of-the-art approaches.
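The abstract describes a two-stage pipeline: a CNN classifier flags images containing obstructions, then a CycleGAN-style converter removes them, with cycle consistency preserving facial content. The following is a minimal illustrative sketch of that flow; the paper's ECGAN architecture and API are not given in the abstract, so the classifier stub, the toy converters `G` and `F`, and all names here are hypothetical assumptions, not the authors' implementation.

```python
import numpy as np

def classify_obstructed(image, threshold=0.5):
    """Stand-in for the CNN classifier: flags an image as obstructed
    when a scalar 'obstruction score' exceeds a threshold. A real
    system would use a trained CNN forward pass here."""
    score = float(np.mean(image))  # placeholder score
    return score > threshold

def cycle_consistency_loss(x, G, F):
    """L1 cycle-consistency term used by CycleGAN-style models:
    ||F(G(x)) - x||_1. G maps obstructed -> clean, F maps back;
    a small loss means facial content survives the round trip."""
    return float(np.mean(np.abs(F(G(x)) - x)))

# Toy converters: G "removes" an additive obstruction offset and
# F re-applies it, so the cycle loss is near zero by construction.
mask = 0.3
G = lambda x: x - mask
F = lambda y: y + mask

x = np.random.rand(8, 8)            # stand-in face image
if classify_obstructed(x + mask):   # stage 1: detect obstruction
    restored = G(x + mask)          # stage 2: convert/remove
print(cycle_consistency_loss(x, G, F))  # near 0 for a perfect inverse
```

In the actual scheme, only images the classifier flags are routed through the converter, so unobstructed faces reach the recognizer untouched.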
Pages: 210-220
Page count: 11