Face Inpainting with Pre-trained Image Transformers

Cited by: 0
Authors
Gonc, Kaan [1 ]
Saglam, Baturay [2 ]
Kozat, Suleyman S. [2 ]
Dibeklioglu, Hamdi [1 ]
Affiliations
[1] Bilkent Univ, Dept Comp Engn, Ankara, Turkey
[2] Bilkent Univ, Dept Elect & Elect Engn, Ankara, Turkey
Keywords
image inpainting; transformers; deep generative models
DOI
10.1109/SIU55565.2022.9864676
Chinese Library Classification (CLC)
TP39 [Computer Applications]
Discipline Classification Codes
081203; 0835
Abstract
Image inpainting is an underdetermined inverse problem: a missing or damaged region can be filled realistically with many different contents. Convolutional neural networks (CNNs) are commonly used to generate aesthetically pleasing content, yet their limited receptive fields restrict how well they capture global characteristics. Transformers can model long-range relationships and generate diverse content by autoregressively modeling pixel-sequence distributions with an image-level attention mechanism. However, current transformer-based inpainting approaches are limited to task-specific datasets and require large-scale data. To remedy this issue, we introduce an image-inpainting approach that leverages pre-trained vision transformers. Experiments show that our approach can outperform CNN-based approaches and achieves performance close to that of task-specific transformer methods.
Pages: 4
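
As an illustration of the autoregressive pixel-sequence modeling described in the abstract, the sketch below uses a pre-trained ImageGPT from the Hugging Face transformers library to complete a masked face image: visible pixel tokens serve as context, and the missing tokens are sampled one at a time. This is a minimal sketch of the general technique, not the authors' pipeline; the checkpoint (openai/imagegpt-small), the file names, and the bottom-half mask are all assumptions made for illustration.

```python
# Hedged sketch: inpainting via a pre-trained autoregressive image transformer.
import numpy as np
import torch
from PIL import Image
from transformers import ImageGPTImageProcessor, ImageGPTForCausalImageModeling

processor = ImageGPTImageProcessor.from_pretrained("openai/imagegpt-small")
model = ImageGPTForCausalImageModeling.from_pretrained("openai/imagegpt-small")
model.eval()

# Encode a face image into a raster-scan sequence of 32x32 = 1024
# color-cluster tokens (ImageGPT quantizes RGB pixels to 512 clusters).
image = Image.open("face.png").convert("RGB")        # assumed input file
tokens = processor(images=image, return_tensors="pt").input_ids  # (1, 1024)

# Treat the bottom half of the image as the damaged region: keep the first
# 512 tokens (top 16 rows) as visible context and discard the rest.
n_keep = 512
sos = torch.full((1, 1), model.config.vocab_size - 1)  # start-of-sequence token
context = torch.cat([sos, tokens[:, :n_keep]], dim=1)

# Autoregressively sample the missing tokens, conditioned on the visible rows.
with torch.no_grad():
    output = model.generate(
        input_ids=context,
        max_length=model.config.n_positions + 1,  # SOS + 1024 pixel tokens
        do_sample=True,
        top_k=40,
    )

# Map sampled cluster indices back to RGB and save the completed image.
clusters = np.asarray(processor.clusters)          # (512, 3) palette in [-1, 1]
pixels = clusters[output[0, 1:].numpy()]           # drop SOS, look up RGB
img = np.rint((pixels + 1.0) * 127.5).astype(np.uint8).reshape(32, 32, 3)
Image.fromarray(img).save("inpainted.png")
```

Because the model is autoregressive over a fixed raster-scan ordering, this simple setup can only condition on tokens that precede the hole; arbitrary hole shapes need more care (a common workaround in the literature is to reorder the sequence so that visible tokens come first).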