Convolutional Networks with Adaptive Inference Graphs

Cited by: 1
Authors
Andreas Veit
Serge Belongie
Affiliations
[1] Google Research
[2] Department of Computer Science and Cornell Tech, Cornell University
Source
International Journal of Computer Vision, 2020, 128(3): 730-741
Keywords
Convolutional neural networks; Gumbel-Softmax; Residual networks
DOI
Not available
Abstract
Do convolutional networks really need a fixed feed-forward structure? What if, after identifying the high-level concept of an image, a network could move directly to a layer that can distinguish fine-grained differences? Currently, a network would first need to execute sometimes hundreds of intermediate layers that specialize in unrelated aspects. Ideally, the more a network already knows about an image, the better it should be at deciding which layer to compute next. In this work, we propose convolutional networks with adaptive inference graphs (ConvNet-AIG) that adaptively define their network topology conditioned on the input image. Following a high-level structure similar to residual networks (ResNets), ConvNet-AIG decides for each input image on the fly which layers are needed. In experiments on ImageNet we show that ConvNet-AIG learns distinct inference graphs for different categories. Both ConvNet-AIG with 50 and 101 layers outperform their ResNet counterparts, while using 20% and 38% less computation, respectively. By grouping parameters into layers for related classes and only executing relevant layers, ConvNet-AIG improves both efficiency and overall classification quality. Lastly, we also study the effect of adaptive inference graphs on the susceptibility to adversarial examples. We observe that ConvNet-AIG is more robust than ResNets, complementing other known defense mechanisms.
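The abstract, together with the Gumbel-Softmax keyword, outlines the core mechanism: each residual layer is preceded by a small gate that inspects the layer's input and decides whether to execute or skip that layer for the current image. Below is a minimal sketch of such a gated residual block in PyTorch, given only as an illustration of the idea; the gating-head architecture, channel sizes, temperature, and the name GatedResidualBlock are assumptions for this sketch, not the authors' released implementation.

```python
# Minimal sketch (not the authors' code) of a per-image gated residual block:
# a tiny gating head produces a hard "execute / skip" decision via the
# Gumbel-Softmax estimator, so the decision stays differentiable in training.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GatedResidualBlock(nn.Module):
    def __init__(self, channels: int, tau: float = 1.0):
        super().__init__()
        self.tau = tau  # Gumbel-Softmax temperature (assumed value)
        # Main residual branch: two 3x3 convolutions, as in a basic ResNet block.
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
        )
        # Lightweight gating head: global pooling + small MLP -> 2 logits
        # ("skip" vs. "execute"). Sizes here are illustrative.
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(channels, 16),
            nn.ReLU(inplace=True),
            nn.Linear(16, 2),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        logits = self.gate(x)
        # Hard (one-hot) sample in the forward pass, soft gradients in backward.
        decision = F.gumbel_softmax(logits, tau=self.tau, hard=True)[:, 1]
        decision = decision.view(-1, 1, 1, 1)
        # Residual connection: when the gate says "skip", the block reduces
        # to the identity mapping for that image.
        return F.relu(x + decision * self.body(x))


if __name__ == "__main__":
    block = GatedResidualBlock(channels=64)
    images = torch.randn(8, 64, 32, 32)  # dummy batch
    out = block(images)
    print(out.shape)  # torch.Size([8, 64, 32, 32])
```

The multiply-by-gate form above keeps training differentiable but still evaluates the convolutional branch; at inference time one would check the gate first and skip the branch entirely whenever it is closed, which is the source of the computational savings the abstract reports.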
Pages: 730 - 741
Page count: 11