Benchmarking Shadow Removal for Facial Landmark Detection

Cited by: 0
Authors
Fu, Lan [1 ]
Guo, Qing [2 ,3 ]
Juefei-Xu, Felix [4 ]
Yu, Hongkai [5 ]
Liu, Yang [6 ]
Feng, Wei [7 ]
Wang, Song [1 ]
Affiliations
[1] Univ South Carolina, Columbia, SC USA
[2] ASTAR, IHPC, Singapore, Singapore
[3] ASTAR, CFAR, Singapore, Singapore
[4] NYU, New York, NY USA
[5] Cleveland State Univ, Cleveland, OH USA
[6] Nanyang Technol Univ, Singapore, Singapore
[7] Tianjin Univ, Tianjin, Peoples R China
Funding
National Research Foundation, Singapore;
Keywords
Face shadow; shadow removal; facial landmark detection;
DOI
10.1109/CAI59869.2024.00059
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Facial landmark detection is a fundamental task whose accuracy underpins many downstream face-related vision applications. In practice, facial landmark detection can be affected by many natural degradations, one of the most common and important being the shadow cast when a light source is blocked by an external occluder. While many advanced shadow removal methods have been proposed in recent years to restore image quality, their effects on facial landmark detection are not well studied. For example, it remains unclear whether shadow removal can enhance the robustness of facial landmark detection to diverse shadow patterns. In this work, we construct, for the first time, a novel benchmark (i.e., SHAREL) that links two independent but related tasks: shadow removal and facial landmark detection. In particular, SHAREL covers diverse face shadows with different intensities, sizes, shapes, and locations. Moreover, to mine hard shadow patterns against facial landmark detection, we propose a novel method (i.e., an adversarial shadow attack) that allows us to construct a challenging subset of the benchmark for comprehensive analysis. With the constructed benchmark, we conduct an extensive analysis of three state-of-the-art shadow removal methods and three landmark detectors. We observe a highly positive correlation between the shadow removal and facial landmark detection tasks, which may provide insight for improving the robustness of facial landmark detection in the future.
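The benchmark idea described in the abstract — synthesizing face shadows of controlled intensity, size, and location, then scoring landmark detectors — can be sketched minimally as follows. This is an illustrative assumption, not the paper's actual implementation: the circular multiplicative shadow model and the function names `add_synthetic_shadow` and `normalized_mean_error` are hypothetical, though NME (normalized mean error) is the standard landmark-detection metric.

```python
import numpy as np

def add_synthetic_shadow(image, center, radius, intensity=0.5):
    """Darken a circular region to simulate a cast shadow.

    image: HxWx3 float array in [0, 1]. intensity in (0, 1] scales
    brightness inside the shadow (lower = darker shadow).
    Circular multiplicative shading is an illustrative simplification.
    """
    h, w = image.shape[:2]
    yy, xx = np.mgrid[0:h, 0:w]
    dist = np.sqrt((yy - center[0]) ** 2 + (xx - center[1]) ** 2)
    mask = (dist <= radius).astype(float)
    # Multiplicative shading: pixels inside the mask are attenuated.
    shade = 1.0 - (1.0 - intensity) * mask
    return image * shade[..., None]

def normalized_mean_error(pred, gt, norm_dist):
    """Standard NME: mean per-landmark Euclidean error divided by a
    normalizing distance (commonly the inter-ocular distance)."""
    errors = np.linalg.norm(pred - gt, axis=1)
    return errors.mean() / norm_dist

# Sketch of the evaluation loop: compare a detector's NME on the
# shadowed image against its NME after shadow removal.
img = np.ones((64, 64, 3))
shadowed = add_synthetic_shadow(img, center=(32, 32), radius=10)
```

Varying `center`, `radius`, and `intensity` yields the diverse shadow patterns the benchmark covers; the paper's adversarial shadow attack would instead search these parameters for the configurations that maximize the detector's NME.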
Pages: 265-271
Page count: 7