Dilated Residual Networks

Cited: 852
Authors
Yu, Fisher [1 ]
Koltun, Vladlen [2 ]
Funkhouser, Thomas [1 ]
Affiliations
[1] Princeton Univ, Princeton, NJ 08544 USA
[2] Intel Labs, San Francisco, CA USA
Funding
U.S. National Science Foundation;
Keywords
DOI
10.1109/CVPR.2017.75
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Convolutional networks for image classification progressively reduce resolution until the image is represented by tiny feature maps in which the spatial structure of the scene is no longer discernible. Such loss of spatial acuity can limit image classification accuracy and complicate the transfer of the model to downstream applications that require detailed scene understanding. These problems can be alleviated by dilation, which increases the resolution of output feature maps without reducing the receptive field of individual neurons. We show that dilated residual networks (DRNs) outperform their non-dilated counterparts in image classification without increasing the model's depth or complexity. We then study gridding artifacts introduced by dilation, develop an approach to removing these artifacts ('degridding'), and show that this further increases the performance of DRNs. In addition, we show that the accuracy advantage of DRNs is further magnified in downstream applications such as object localization and semantic segmentation.
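The abstract's key mechanism, dilation, spaces the taps of a convolution kernel `d` elements apart, so the effective receptive field grows to `d*(k-1)+1` without downsampling the feature map. A minimal 1-D sketch of this idea (plain NumPy; the function name and shapes are illustrative, not taken from the paper):

```python
import numpy as np

def dilated_conv1d(x, w, dilation=1):
    """'Valid' 1-D convolution whose kernel taps are spaced `dilation`
    elements apart. Effective receptive field: dilation * (len(w) - 1) + 1."""
    k = len(w)
    span = dilation * (k - 1) + 1          # effective receptive field
    out_len = len(x) - span + 1
    return np.array([
        sum(w[j] * x[i + j * dilation] for j in range(k))
        for i in range(out_len)
    ])

x = np.arange(16, dtype=float)
w = np.array([1.0, 1.0, 1.0])

y1 = dilated_conv1d(x, w, dilation=1)  # receptive field 3, output length 14
y2 = dilated_conv1d(x, w, dilation=2)  # receptive field 5, output length 12
```

Note that `y2` covers the same 5-element span a stride-2 pipeline would, yet its output stays at near-input resolution instead of being halved, which is the trade-off DRNs exploit.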
Pages: 636 - 644
Page count: 9
Related Papers
50 records in total
  • [32] Dilated Deep Residual Network for Image Denoising
    Wang, Tianyang
    Sun, Mingxuan
    Hu, Kaoning
    2017 IEEE 29TH INTERNATIONAL CONFERENCE ON TOOLS WITH ARTIFICIAL INTELLIGENCE (ICTAI 2017), 2017, : 1272 - 1279
  • [33] A multiscale dilated residual network for image denoising
    Li, Dongjie
    Chen, Huaian
    Jin, Guoqiang
    Jin, Yi
    Zhu, Changan
    Chen, Enhong
    MULTIMEDIA TOOLS AND APPLICATIONS, 2020, 79 (45-46) : 34443 - 34458
  • [34] Dilated residual attention network for load disaggregation
    Xia, Min
    Liu, Wan'an
    Xu, Yiqing
    Wang, Ke
    Zhang, Xu
    NEURAL COMPUTING & APPLICATIONS, 2019, 31 (12): 8931 - 8953
  • [35] DILATED NETWORKS FOR PHOTONIC SWITCHING
    PADMANABHAN, K
    NETRAVALI, AN
    IEEE TRANSACTIONS ON COMMUNICATIONS, 1987, 35 (12) : 1357 - 1365
  • [36] Dilated Recurrent Neural Networks
    Chang, Shiyu
    Zhang, Yang
    Han, Wei
    Yu, Mo
    Guo, Xiaoxiao
    Tan, Wei
    Cui, Xiaodong
    Witbrock, Michael
    Hasegawa-Johnson, Mark
    Huang, Thomas S.
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 30 (NIPS 2017), 2017, 30
  • [37] LEMON: A Lightweight Facial Emotion Recognition System for Assistive Robotics Based on Dilated Residual Convolutional Neural Networks
    Devaram, Rami Reddy
    Beraldo, Gloria
    De Benedictis, Riccardo
    Mongiovi, Misael
    Cesta, Amedeo
    SENSORS, 2022, 22 (09)
  • [38] RESIDUAL DILATED NETWORK WITH ATTENTION FOR IMAGE BLIND DENOISING
    Hou, Guanqun
    Yang, Yujiu
    Xue, Jing-Hao
    2019 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO (ICME), 2019, : 248 - 253
  • [39] Optimization Method of Residual Networks of Residual Networks for Image Classification
    Zhang, Ke
    Guo, Liru
    Gao, Ce
    2018 IEEE INTERNATIONAL CONFERENCE ON BIG DATA AND SMART COMPUTING (BIGCOMP), 2018, : 321 - 325
  • [40] Optimization Method of Residual Networks of Residual Networks for Image Classification
    Lin, Long
    Yuan, Hao
    Guo, Liru
    Kuang, Yingqun
    Zhang, Ke
    INTELLIGENT COMPUTING METHODOLOGIES, ICIC 2018, PT III, 2018, 10956 : 212 - 222