Face Inpainting with Pre-trained Image Transformers

Times Cited: 0
Authors
Gonc, Kaan [1 ]
Saglam, Baturay [2 ]
Kozat, Suleyman S. [2 ]
Dibeklioglu, Hamdi [1 ]
Affiliations
[1] Bilkent Univ, Dept of Computer Engineering, Ankara, Turkey
[2] Bilkent Univ, Dept of Electrical & Electronics Engineering, Ankara, Turkey
Keywords
image inpainting; transformers; deep generative models;
DOI
10.1109/SIU55565.2022.9864676
CLC Number
TP39 [Computer Applications]
Discipline Codes
081203; 0835
Abstract
Image inpainting is an underdetermined inverse problem: many different contents can realistically fill a missing or damaged region. Convolutional neural networks (CNNs) are commonly used to generate visually pleasing content, yet their restricted receptive fields limit how well they capture global structure. Transformers, by contrast, model long-range relationships and can generate diverse content by autoregressively modeling pixel-sequence distributions with an image-level attention mechanism. However, current transformer-based inpainting approaches are tied to task-specific datasets and require large-scale training data. To remedy this, we introduce an image inpainting approach that leverages pre-trained vision transformers. Experiments show that our approach outperforms CNN-based methods and achieves performance close to that of task-specific transformer methods.
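Note (illustrative only): the abstract describes autoregressive completion of pixel sequences with a pre-trained image transformer. A minimal sketch of that idea is given below using the publicly available openai/imagegpt-small checkpoint via Hugging Face's transformers library; the checkpoint choice, the file paths, and the simple bottom-half mask are assumptions made here for illustration, not the paper's actual pipeline (which may use different masking, fine-tuning, and decoding).

    # Minimal sketch: raster-order face completion with a pre-trained image
    # transformer. Assumptions: HF ImageGPT checkpoint, hypothetical file paths,
    # and a bottom-half mask; the paper's actual method may differ.
    import numpy as np
    import torch
    from PIL import Image
    from transformers import ImageGPTImageProcessor, ImageGPTForCausalImageModeling

    processor = ImageGPTImageProcessor.from_pretrained("openai/imagegpt-small")
    model = ImageGPTForCausalImageModeling.from_pretrained("openai/imagegpt-small")
    model.eval()

    # Quantize the face image into 32x32 color-cluster tokens (1024 tokens).
    image = Image.open("face.png").convert("RGB")  # hypothetical input path
    tokens = processor(images=image, return_tensors="pt").input_ids  # (1, 1024)

    # Treat the bottom half as missing: keep the top 16 rows (512 tokens) as
    # context, prefixed with the start-of-sequence token (id = vocab_size - 1).
    n_px = 32
    sos = torch.full((1, 1), model.config.vocab_size - 1, dtype=torch.long)
    context = torch.cat([sos, tokens[:, : n_px * n_px // 2]], dim=1)

    # Autoregressively sample the missing tokens in raster order.
    with torch.no_grad():
        output = model.generate(
            input_ids=context,
            max_length=n_px * n_px + 1,  # 1024 image tokens + SOS
            do_sample=True,
            top_k=40,
        )

    # Map cluster ids back to RGB through the processor's color palette.
    clusters = np.asarray(processor.clusters)  # (512, 3), values in [-1, 1]
    sample = output[0, 1:].cpu().numpy()       # drop the SOS token
    pixels = np.rint(127.5 * (clusters[sample] + 1.0)).astype(np.uint8)
    Image.fromarray(pixels.reshape(n_px, n_px, 3)).save("face_inpainted.png")

Because a raster-order autoregressive model can only condition on tokens that precede the missing region, this sketch handles only masks at the end of the pixel sequence; inpainting arbitrary holes, as the paper targets, requires a different factorization or iterative resampling of the masked positions.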
Pages: 4
Related Papers
50 records in total (entries [41]–[50] shown)
  • [41] Logical Transformers: Infusing Logical Structures into Pre-Trained Language Models
    Wang, Borui
    Huang, Qiuyuan
    Deb, Budhaditya
    Halfaker, Aaron
    Shao, Liqun
    McDuff, Daniel
    Awadallah, Ahmed Hassan
    Radev, Dragomir
    Gao, Jianfeng
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL 2023, 2023, : 1762 - 1773
  • [42] Finding and Editing Multi-Modal Neurons in Pre-Trained Transformers
    Pan, Haowen
    Cao, Yixin
    Wang, Xiaozhi
    Yang, Xun
    Wang, Meng
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: ACL 2024, 2024, : 1012 - 1037
  • [43] Fast and accurate Bayesian optimization with pre-trained transformers for constrained engineering problems
    Picard, Cyril
    Ahmed, Faez
    STRUCTURAL AND MULTIDISCIPLINARY OPTIMIZATION, 2025, 68 (3)
  • [44] PTW: Pivotal Tuning Watermarking for Pre-Trained Image Generators
    Lukas, Nils
    Kerschbaum, Florian
    PROCEEDINGS OF THE 32ND USENIX SECURITY SYMPOSIUM, 2023, : 2241 - 2258
  • [45] Underwater Image Enhancement Using Pre-trained Transformer
    Boudiaf, Abderrahmene
    Guo, Yuhang
    Ghimire, Adarsh
    Werghi, Naoufel
    De Masi, Giulia
    Javed, Sajid
    Dias, Jorge
    IMAGE ANALYSIS AND PROCESSING, ICIAP 2022, PT III, 2022, 13233 : 480 - 488
  • [46] Pre-trained SAM as data augmentation for image segmentation
    Wu, Junjun
    Rao, Yunbo
    Zeng, Shaoning
    Zhang, Bob
    CAAI TRANSACTIONS ON INTELLIGENCE TECHNOLOGY, 2025, 10 (01) : 268 - 282
  • [47] Image Hashing by Pre-Trained Deep Neural Network
    Li, Pingyuan
    Zhang, Dan
    Yuan, Xiaoguang
    Jiang, Suiping
    2022 ASIA CONFERENCE ON ALGORITHMS, COMPUTING AND MACHINE LEARNING (CACML 2022), 2022, : 468 - 471
  • [48] FEGAN: Flexible and Efficient Face Editing With Pre-Trained Generator
    Ning, Xin
    Xu, Shaohui
    Li, Weijun
    Nie, Shuai
    IEEE ACCESS, 2020, 8 : 65340 - 65350
  • [50] ZeroI2V: Zero-Cost Adaptation of Pre-trained Transformers from Image to Video
    Li, Xinhao
    Zhu, Yuhan
    Wang, Limin
    COMPUTER VISION - ECCV 2024, PT LXXXIII, 2025, 15141 : 425 - 443