Face Inpainting with Pre-trained Image Transformers

Cited: 0
Authors
Gonc, Kaan [1]
Saglam, Baturay [2]
Kozat, Suleyman S. [2]
Dibeklioglu, Hamdi [1]
Affiliations
[1] Bilkent Univ, Dept of Computer Engineering, Ankara, Turkey
[2] Bilkent Univ, Dept of Electrical & Electronics Engineering, Ankara, Turkey
Keywords
image inpainting; transformers; deep generative models;
DOI
10.1109/SIU55565.2022.9864676
CLC Number
TP39 [Applications of Computers];
Subject Classification Codes
081203; 0835;
Abstract
Image inpainting is an underdetermined inverse problem: many different contents can realistically fill a missing or damaged region. Convolutional neural networks (CNNs) are commonly used to generate aesthetically pleasing content, yet their restricted receptive fields limit how well they capture global characteristics. Transformers can model long-range relationships and generate diverse content by autoregressively modeling pixel-sequence distributions with an image-level attention mechanism. However, current transformer-based inpainting approaches are limited to task-specific datasets and require large-scale data. We introduce an image inpainting approach that leverages pre-trained vision transformers to remedy this issue. Experiments show that our approach can outperform CNN-based approaches and achieves performance remarkably close to that of task-specific transformer methods.
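To make the mechanism described in the abstract concrete, below is a minimal sketch of autoregressive pixel-sequence completion with a pre-trained image transformer. This is not the authors' implementation: the model interface, the raster-order fill, and the name inpaint_autoregressive are assumptions for illustration; any ImageGPT-style model that returns next-token logits over quantized pixel tokens would fit this shape.

# Minimal sketch (assumed interface, not the paper's code): fill masked
# pixel tokens in raster order with a pre-trained autoregressive
# image transformer.
import torch

def inpaint_autoregressive(model, tokens, mask):
    """tokens: (seq_len,) LongTensor of quantized pixel tokens.
    mask:   (seq_len,) BoolTensor, True where the pixel is missing.
    Assumes model(x) returns next-token logits of shape (1, len(x), vocab)
    and that position 0 is never masked.
    """
    tokens = tokens.clone()
    for i in torch.nonzero(mask).flatten().tolist():
        # Image-level attention lets each step condition on the full prefix
        # of visible and already-filled pixels.
        logits = model(tokens[:i].unsqueeze(0))       # (1, i, vocab)
        probs = torch.softmax(logits[0, -1], dim=-1)  # distribution over token i
        # Sampling (rather than argmax) yields the diverse plausible
        # completions the abstract refers to.
        tokens[i] = torch.multinomial(probs, num_samples=1).item()
    return tokens

Sampling each missing token from the predicted distribution, instead of taking the most likely token, is what makes the fill-ins non-deterministic: repeated calls produce different but plausible completions of the same hole.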
Pages: 4
Related Papers
50 items in total
  • [21] DeFormer: Decomposing Pre-trained Transformers for Faster Question Answering
    Cao, Qingqing
    Trivedi, Harsh
    Balasubramanian, Aruna
    Balasubramanian, Niranjan
    58TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2020), 2020, : 4487 - 4497
  • [22] Routing Generative Pre-Trained Transformers for Printed Circuit Board
    Wang, Hao
    Tu, Jun
    Bai, Shenglong
    Zheng, Jie
    Qian, Weikang
    Chen, Jienan
    2024 INTERNATIONAL SYMPOSIUM OF ELECTRONICS DESIGN AUTOMATION, ISEDA 2024, 2024, : 160 - 165
  • [23] Towards Summarizing Code Snippets Using Pre-Trained Transformers
    Mastropaolo, Antonio
    Tufano, Rosalia
    Ciniselli, Matteo
    Aghajani, Emad
    Pascarella, Luca
    Bavota, Gabriele
arXiv,
  • [24] Investor's ESG tendency probed by pre-trained transformers
    Li, Chao
    Keeley, Alexander Ryota
    Takeda, Shutaro
    Seki, Daikichi
    Managi, Shunsuke
    CORPORATE SOCIAL RESPONSIBILITY AND ENVIRONMENTAL MANAGEMENT, 2025, 32 (02) : 2051 - 2071
  • [25] TWilBert: Pre-trained deep bidirectional transformers for Spanish Twitter
    Gonzalez, Jose Angel
    Hurtado, Lluis-F.
    Pla, Ferran
    NEUROCOMPUTING, 2021, 426 : 58 - 69
  • [26] Causal Interpretation of Self-Attention in Pre-Trained Transformers
    Rohekar, Raanan Y.
    Gurwicz, Yaniv
    Nisimov, Shami
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [27] An Empirical Study of Pre-trained Transformers for Arabic Information Extraction
    Lan, Wuwei
    Chen, Yang
    Xu, Wei
    Ritter, Alan
    PROCEEDINGS OF THE 2020 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP), 2020, : 4727 - 4734
  • [28] Handwritten Document Recognition Using Pre-trained Vision Transformers
    Parres, Daniel
    Anitei, Dan
    Paredes, Roberto
    DOCUMENT ANALYSIS AND RECOGNITION-ICDAR 2024, PT II, 2024, 14805 : 173 - 190
  • [29] Experiments in News Bias Detection with Pre-trained Neural Transformers
    Menzner, Tim
    Leidner, Jochen L.
    ADVANCES IN INFORMATION RETRIEVAL, ECIR 2024, PT IV, 2024, 14611 : 270 - 284
  • [30] Emotion Recognition with Pre-Trained Transformers Using Multimodal Signals
    Vazquez-Rodriguez, Juan
    Lefebvre, Gregoire
    Cumin, Julien
    Crowley, James L.
    2022 10TH INTERNATIONAL CONFERENCE ON AFFECTIVE COMPUTING AND INTELLIGENT INTERACTION (ACII), 2022,