Face inpainting based on high-level facial attributes

Cited by: 16
Authors
Jampour, Mahdi [1 ,7 ]
Li, Chen [4 ,6 ]
Yu, Lap-Fai [8 ]
Zhou, Kun [5 ]
Lin, Stephen [6 ]
Bischof, Horst [2 ,3 ]
Affiliations
[1] Graz Univ Technol, Graz, Austria
[2] Graz Univ Technol, Res, Graz, Austria
[3] Graz Univ Technol, Inst Comp Graph & Vis, Graz, Austria
[4] Zhejiang Univ, State Key Lab CAD&CG, Hangzhou, Zhejiang, Peoples R China
[5] Zhejiang Univ, Comp Sci Dept, Hangzhou, Zhejiang, Peoples R China
[6] Microsoft Res, Redmond, WA USA
[7] Univ Massachusetts Boston, Boston, MA 02125 USA
[8] Univ Massachusetts Boston, Graph & Virtual Environm Lab, Boston, MA USA
Funding
US National Science Foundation;
Keywords
Face inpainting; Face analysis; Facial attributes; IMAGE COMPLETION; RECOGNITION; REMOVAL;
DOI
10.1016/j.cviu.2017.05.008
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
We introduce a novel data-driven approach for face inpainting, which makes use of the observable region of an occluded face as well as its inferred high-level facial attributes, namely gender, ethnicity, and expression. Based on the intuition that the realism of a face inpainting result depends significantly on its overall consistency with respect to these high-level attributes, our approach selects a guidance face that matches the targeted attributes and uses it, together with the observable regions of the input face, to inpaint the missing areas. These two sources of information are balanced by an adaptive optimization, and the inpainting is performed on intrinsic image layers rather than in the RGB color space, which handles illumination differences between the target face and the guidance face and further enhances the resulting visual quality. Our experiments demonstrate that this approach is effective in inpainting facial components such as the mouth or the eyes that may be partially or completely occluded in the input face. A perceptual study shows that our approach generates more natural facial appearances by accounting for high-level facial attributes. (C) 2017 Elsevier Inc. All rights reserved.
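The core idea of the abstract — copy appearance from an attribute-matched guidance face into the occluded region while working on intrinsic image layers so that illumination stays consistent — can be sketched roughly as follows. This is a minimal illustration only, not the paper's method: the crude luminance-based intrinsic split, the function names (`intrinsic_split`, `inpaint_with_guidance`), and the fixed blending weight `alpha` (a stand-in for the paper's adaptive optimization) are all our own assumptions.

```python
import numpy as np

def intrinsic_split(img, eps=1e-6):
    """Crude intrinsic decomposition: shading ~ per-pixel luminance,
    reflectance = img / shading. A toy stand-in for the intrinsic
    image layers used in the paper; illustration only."""
    shading = img.mean(axis=2, keepdims=True)      # (H, W, 1) luminance proxy
    reflectance = img / (shading + eps)
    return shading, reflectance

def inpaint_with_guidance(target, guidance, mask, alpha=0.5):
    """Fill masked pixels of `target` using `guidance` on intrinsic layers.

    target, guidance: float arrays in [0, 1], shape (H, W, 3)
    mask: boolean (H, W); True marks occluded pixels to fill
    alpha: fixed blend between target and guidance shading
           (the paper instead balances the two sources adaptively)
    """
    t_shade, t_refl = intrinsic_split(target)
    g_shade, g_refl = intrinsic_split(guidance)

    m = mask[..., None]  # broadcast the mask over the color channels
    # Copy the guidance face's reflectance into the hole, but blend the
    # shading so the filled region follows the target's illumination.
    refl = np.where(m, g_refl, t_refl)
    shade = np.where(m, alpha * t_shade + (1 - alpha) * g_shade, t_shade)
    return np.clip(refl * shade, 0.0, 1.0)
```

Outside the mask the target image is reconstructed unchanged (up to the `eps` regularizer); inside it, texture comes from the guidance face while illumination is pulled toward the target, which is the intuition behind inpainting on intrinsic layers rather than raw RGB.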
Pages: 29 - 41
Page count: 13
Related Papers
50 records in total
  • [1] Serial dependence of facial identity reflects high-level face coding
    Turbett, Kaitlyn
    Palermo, Romina
    Bell, Jason
    Hanran-Smith, Dewi Anna
    Jeffery, Linda
    [J]. VISION RESEARCH, 2021, 182 : 9 - 19
  • [2] Extracting Facial Features and Face Inpainting
    Yu, Lin Chun
    Tang, Nick C.
    Jun, Huang Bo
    Shih, Timothy K.
    [J]. ADVANCES IN MULTIMEDIA INFORMATION PROCESSING - PCM 2008, 9TH PACIFIC RIM CONFERENCE ON MULTIMEDIA, 2008, 5353 : 863 - 866
  • [3] Mask removal: Face inpainting via attributes
    Jiang, Yefan
    Yang, Fan
    Bian, Zhangxing
    Lu, Changsheng
    Xia, Siyu
    [J]. MULTIMEDIA TOOLS AND APPLICATIONS, 2022, 81 (21) : 29785 - 29797
  • [4] Face inpainting based on GAN by facial prediction and fusion as guidance information
    Zhang, Xian
    Shi, Canghong
    Wang, Xin
    Wu, Xi
    Li, Xiaojie
    Lv, Jiancheng
    Mumtaz, Imran
    [J]. APPLIED SOFT COMPUTING, 2021, 111
  • [5] High-level attributes modeling for indoor scenes classification
    Wang, Chaojie
    Yu, Jun
    Tao, Dapeng
    [J]. NEUROCOMPUTING, 2013, 121 : 337 - 343
  • [6] A high-level electrical energy ontology with weighted attributes
    Kucuk, Dilek
    [J]. ADVANCED ENGINEERING INFORMATICS, 2015, 29 (03) : 513 - 522
  • [7] High-Level Face Adaptation Without Awareness
    Adams, Wendy J.
    Gray, Katie L. H.
    Garner, Matthew
    Graf, Erich W.
    [J]. PSYCHOLOGICAL SCIENCE, 2010, 21 (02) : 205 - 210
  • [8] Face tracking with low-level and high-level information
    Xu, D
    Li, S
    Liu, ZK
    [J]. CHINESE JOURNAL OF ELECTRONICS, 2005, 14 (01) : 99 - 102
  • [9] How do Facial Parts Contribute to Expression Perception? An Answer from the High-level Face Adaptation
    Song, Miao
    Shinomori, Keizo
    Zhang, Shiyong
    [J]. INFORMATION-AN INTERNATIONAL INTERDISCIPLINARY JOURNAL, 2010, 13 (06): : 1947 - 1956