Explicit Filterbank Learning for Neural Image Style Transfer and Image Processing

Cited by: 17
Authors
Chen, Dongdong [1 ]
Yuan, Lu [2 ]
Liao, Jing [3 ]
Yu, Nenghai [1 ]
Hua, Gang [4 ]
Affiliations
[1] Univ Sci & Technol China, Dept Elect Engn & Informat Sci, Hefei 230026, Anhui, Peoples R China
[2] Microsoft Res, Redmond, WA 98052 USA
[3] City Univ Hong Kong, Dept Comp Sci, Kowloon Tong, Hong Kong, Peoples R China
[4] Wormpex Res LLC, Bellevue, WA 98004 USA
Funding
National Key Research and Development Program of China;
Keywords
Task analysis; Convolution; Decoding; Neural networks; Feature extraction; Fuses; Image processing and computer vision; style transfer; TEXTURE SYNTHESIS; MODEL;
DOI
10.1109/TPAMI.2020.2964205
CLC Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
Image style transfer re-renders the content of one image in the style of another. Most existing methods couple content and style information in their network structures and hyper-parameters, and learn them as a black box. For better understanding, this paper provides a new, explicitly decoupled perspective. Specifically, we propose StyleBank, which is composed of multiple convolution filter banks, each of which explicitly represents one style. To transfer an image to a specific style, the corresponding filter bank is applied to the intermediate feature produced by a single auto-encoder. The StyleBank and the auto-encoder are jointly learnt in such a way that the auto-encoder does not encode any style information. This explicit representation also enables us to conduct incremental learning to add a new style, and to fuse styles not only at the image level but also at the region level. Our method is the first style transfer network that links back to traditional texton mapping methods, and it provides new understanding of neural style transfer. We further apply this general filterbank-learning idea to two different multi-parameter image processing tasks: edge-aware image smoothing and denoising. Experiments demonstrate that it achieves results comparable to its single-parameter counterparts.
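The architecture the abstract describes, namely a single shared auto-encoder whose intermediate feature map is convolved with a per-style filter bank, can be sketched as follows. This is a minimal illustrative sketch only: the encoder/decoder are identities, the banks are random rather than trained, and the names `conv2d`, `StyleBank`, `add_style`, and `transfer` are assumptions of this sketch, not the paper's actual code.

```python
import numpy as np


def conv2d(feat, kernels):
    """Naive 2-D convolution: feat (C_in, H, W), kernels (C_out, C_in, k, k).

    Stride 1 with zero padding, so the spatial size is preserved.
    """
    c_out, c_in, k, _ = kernels.shape
    pad = k // 2
    f = np.pad(feat, ((0, 0), (pad, pad), (pad, pad)))
    _, h, w = feat.shape
    out = np.zeros((c_out, h, w))
    for o in range(c_out):
        for i in range(c_in):
            for y in range(h):
                for x in range(w):
                    out[o, y, x] += np.sum(f[i, y:y + k, x:x + k] * kernels[o, i])
    return out


class StyleBank:
    """One shared encoder/decoder plus one convolution filter bank per style.

    Here the encoder/decoder are identities and the banks are random; in the
    paper both are trained jointly so the encoder stays free of style
    information.
    """

    def __init__(self, channels=8, ksize=3, seed=0):
        self.channels, self.ksize = channels, ksize
        self.banks = {}
        self.rng = np.random.default_rng(seed)

    def add_style(self, name):
        # Incremental learning in the paper: only a new filter bank is
        # trained for the new style; the shared auto-encoder is untouched.
        self.banks[name] = self.rng.standard_normal(
            (self.channels, self.channels, self.ksize, self.ksize))

    def transfer(self, feature, style):
        # Apply the style-specific filter bank to the shared content feature.
        return conv2d(feature, self.banks[style])
```

Swapping the style name swaps only the filter bank applied to the shared content feature; the feature itself is computed once, which is what makes the content/style decoupling explicit.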
Pages: 2373-2387
Page count: 15
Related Papers
50 records in total
  • [31] Robust Nonparametric Distribution Transfer with Exposure Correction for Image Neural Style Transfer
    Liu, Shuai
    Hong, Caixia
    He, Jing
    Tian, Zhiqiang
    SENSORS, 2020, 20 (18) : 1 - 19
  • [32] Unbiased Image Style Transfer
    Choi, Hyun-Chul
    IEEE ACCESS, 2020, 8 : 196600 - 196608
  • [33] SEMANTIC IMAGE STYLE TRANSFER
    Stalin, S. Binu
    Judith, J. E.
    Jegan, C. Dhayananth
    2023 ADVANCED COMPUTING AND COMMUNICATION TECHNOLOGIES FOR HIGH PERFORMANCE APPLICATIONS, ACCTHPA, 2023,
  • [34] Learning Linear Transformations for Fast Image and Video Style Transfer
    Li, Xueting
    Liu, Sifei
    Kautz, Jan
    Yang, Ming-Hsuan
    2019 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR 2019), 2019, : 3804 - 3812
  • [35] Deep Learning-Based Application of Image Style Transfer
    Liao, Yimi
    Huang, Youfu
MATHEMATICAL PROBLEMS IN ENGINEERING, 2022, 2022
  • [36] Advanced deep learning techniques for image style transfer: A survey
    Liu, Long
    Xi, Zhixuan
    Ji, RuiRui
    Ma, Weigang
    SIGNAL PROCESSING-IMAGE COMMUNICATION, 2019, 78 : 465 - 470
  • [38] End-to-end learning for arbitrary image style transfer
    Yoon, Y. B.
    Kim, M. S.
    Choi, H. C.
    ELECTRONICS LETTERS, 2018, 54 (22) : 1276 - 1277
  • [39] CAPTCHA Image Generation: Two-Step Style-Transfer Learning in Deep Neural Networks
    Kwon, Hyun
    Yoon, Hyunsoo
    Park, Ki-Woong
    SENSORS, 2020, 20 (05)
  • [40] Neural Style Transfer for image within images and conditional GANs for destylization
    Ubhi, Jagpal Singh
    Aggarwal, Ashwani Kumar
    Mallika
    JOURNAL OF VISUAL COMMUNICATION AND IMAGE REPRESENTATION, 2022, 85