One-for-All: An Efficient Variable Convolution Neural Network for In-Loop Filter of VVC

Cited by: 26
Authors
Huang, Zhijie [1 ]
Sun, Jun [1 ]
Guo, Xiaopeng [1 ]
Shang, Mingyu [1 ]
Affiliations
[1] Peking Univ, Wangxuan Inst Comp Technol, Beijing 100871, Peoples R China
Keywords
Encoding; Videos; Feature extraction; Convolution; Adaptation models; Visualization; Training; Variable; in-loop filter; attention; versatile video coding (VVC); CNN;
DOI
10.1109/TCSVT.2021.3089498
CLC Classification
TM [Electrical Engineering]; TN [Electronic Technology, Communication Technology];
Subject Classification Code
0808; 0809;
Abstract
Recently, many studies on convolution neural network (CNN) based in-loop filters have been proposed to improve coding efficiency. However, most existing CNN based filters train and deploy multiple networks for various quantization parameters (QPs) and frame types (FTs), which drastically increases the cost of training these models and the memory burden on the video codec. In this paper, we propose a novel variable CNN (VCNN) based in-loop filter for VVC, which can effectively handle compressed videos with different QPs and FTs via a single model. Specifically, an efficient and flexible attention module is developed to recalibrate features according to QPs or FTs. We then embed the module into the residual block so that these informative features can be continuously utilized throughout the residual learning process. To minimize information loss in the learning process of the entire network, we employ a residual feature aggregation (RFA) module for more efficient feature extraction. On this basis, an efficient network architecture, VCNN, is designed that not only effectively reduces compression artifacts but also adapts to various QPs and FTs. To address training data imbalance across QPs and FTs and to improve the robustness of the model, a focal mean square error loss function is employed to train the proposed network. We then integrate the VCNN into VVC as an additional in-loop filtering tool after the deblocking filter. Extensive experimental results show that our VCNN approach achieves average BD-rate reductions of 3.63%, 4.36%, 4.23%, and 3.56% under the all intra, low-delay P, low-delay B, and random access configurations, respectively, which is even better than QP-separate models.
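The focal mean square error loss mentioned in the abstract re-weights per-sample errors so that hard samples (e.g. from under-represented QP/FT combinations) contribute more to training than easy ones, by analogy with the focal loss used in detection. The paper's exact formulation is not reproduced in this record; the sketch below is a minimal, hypothetical NumPy illustration in which the focal weight is the squared error normalized by the batch maximum and raised to a power `gamma` — both the normalization scheme and `gamma` are illustrative assumptions, not the authors' published definition:

```python
import numpy as np

def focal_mse(pred, target, gamma=1.0, eps=1e-8):
    """Illustrative focal MSE: large-error samples get weights near 1,
    small-error samples are down-weighted toward 0."""
    se = (pred - target) ** 2           # per-element squared error
    w = (se / (se.max() + eps)) ** gamma  # focal weight in [0, 1]
    return np.mean(w * se)              # weighted mean squared error
```

With `gamma = 0` this reduces to the ordinary MSE; increasing `gamma` suppresses the contribution of already-well-reconstructed samples, which is the behavior the abstract attributes to the focal loss for handling imbalanced QP/FT training data.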
Pages: 2342-2355
Page count: 14
Related Papers
50 records
  • [1] In-Loop Filter with Dense Residual Convolutional Neural Network for VVC
    Chen, Sijia
    Chen, Zhenzhong
    Wang, Yingbin
    Liu, Shan
    THIRD INTERNATIONAL CONFERENCE ON MULTIMEDIA INFORMATION PROCESSING AND RETRIEVAL (MIPR 2020), 2020, : 153 - 156
  • [2] CONVOLUTIONAL NEURAL NETWORK BASED IN-LOOP FILTER FOR VVC INTRA CODING
    Li, Yue
    Zhang, Li
    Zhang, Kai
    2021 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP), 2021, : 2104 - 2108
  • [3] Low Complexity In-Loop Filter for VVC Based on Convolution and Transformer
    Feng, Zhen
    Jung, Cheolkon
    Zhang, Hao
    Liu, Yang
    Li, Ming
    IEEE ACCESS, 2024, 12 : 120316 - 120325
  • [4] MULTI-GRADIENT CONVOLUTIONAL NEURAL NETWORK BASED IN-LOOP FILTER FOR VVC
    Huang, Zhijie
    Li, Yunchang
    Sun, Jun
    2020 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO (ICME), 2020,
  • [5] Residual-guided In-loop Filter Using Convolution Neural Network
    Jia, Wei
    Li, Li
    Li, Zhu
    Zhang, Xiang
    Liu, Shan
    ACM TRANSACTIONS ON MULTIMEDIA COMPUTING COMMUNICATIONS AND APPLICATIONS, 2021, 17 (04)
  • [6] An Efficient QP Variable Convolutional Neural Network Based In-loop Filter for Intra Coding
    Huang, Zhijie
    Guo, Xiaopeng
    Shang, Mingyu
    Gao, Jie
    Sun, Jun
    2021 DATA COMPRESSION CONFERENCE (DCC 2021), 2021, : 33 - 42
  • [7] RTNN: A Neural Network-Based In-Loop Filter in VVC Using Resblock and Transformer
    Zhang, Hao
    Liu, Yunfeng
    Jung, Cheolkon
    Liu, Yang
    Li, Ming
    IEEE ACCESS, 2024, 12 : 104599 - 104610
  • [8] VVC In-Loop Filtering Based on Deep Convolutional Neural Network
    Bouaafia, Soulef
    Messaoud, Seifeddine
    Khemiri, Randa
    Sayadi, Fatma Elzahra
    COMPUTATIONAL INTELLIGENCE AND NEUROSCIENCE, 2021, 2021
  • [9] Dense Inception Attention Neural Network for In-Loop Filter
    Xu, Xiaoyu
    Qian, Jian
    Yu, Li
    Wang, Hongkui
    Zeng, Xing
    Li, Zhengang
    Wang, Ning
    2019 PICTURE CODING SYMPOSIUM (PCS), 2019,