Deep Neural Networks with Efficient Guaranteed Invariances

Cited by: 0
Authors
Rath, Matthias [1,2]
Condurache, Alexandru Paul [1,2]
Affiliations
[1] Robert Bosch GmbH, Cross-Domain Computing Solutions, Stuttgart, Germany
[2] University of Lübeck, Institute for Signal Processing, Lübeck, Germany
Keywords
(none listed)
DOI
Not available
CLC Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
We address the problem of improving the performance, and in particular the sample complexity, of deep neural networks by enforcing and guaranteeing invariances to symmetry transformations rather than learning them from data. Group-equivariant convolutions are a popular approach to obtain equivariant representations; the desired invariance is then imposed using pooling operations. For rotations, it has been shown that using invariant integration instead of pooling further improves the sample complexity. In this contribution, we first extend invariant integration beyond rotations to flips and scale transformations. We then address the problem of incorporating multiple desired invariances into a single network. For this purpose, we propose a multi-stream architecture in which each stream is invariant to a different transformation, so that the network can benefit from multiple invariances simultaneously. We demonstrate our approach with successful experiments on Scaled-MNIST, SVHN, CIFAR-10, and STL-10.
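To make the multi-stream idea concrete, below is a minimal PyTorch-style sketch, not the authors' implementation: each hypothetical stream applies a shared backbone to every element of a finite transformation orbit and max-pools over the orbit, which yields exact invariance to that transformation; the paper itself builds on group-equivariant convolutions and replaces pooling with invariant integration. All class and variable names (InvariantStream, MultiStreamNet, etc.) are illustrative assumptions.

import torch
import torch.nn as nn

class InvariantStream(nn.Module):
    # One stream: apply a shared backbone to every element of a finite
    # transformation orbit, then max-pool over the orbit. Because the
    # orbit is the same set for any transformed input, the max is
    # exactly invariant to that transformation. (Simplified stand-in
    # for the paper's equivariant convolutions + invariant integration.)
    def __init__(self, transforms, channels=16):
        super().__init__()
        self.transforms = transforms  # callables forming a finite orbit
        self.backbone = nn.Sequential(
            nn.Conv2d(1, channels, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )

    def forward(self, x):
        feats = torch.stack([self.backbone(t(x)) for t in self.transforms])
        return feats.max(dim=0).values  # orbit pooling -> invariance

class MultiStreamNet(nn.Module):
    # Multi-stream architecture: concatenate streams that are each
    # invariant to a different transformation, so the classifier can
    # exploit several invariances simultaneously.
    def __init__(self, num_classes=10):
        super().__init__()
        rotations = [lambda x, k=k: torch.rot90(x, k, dims=(2, 3))
                     for k in range(4)]                       # C4 rotations
        flips = [lambda x: x, lambda x: torch.flip(x, dims=(3,))]  # h-flip
        self.streams = nn.ModuleList(
            [InvariantStream(rotations), InvariantStream(flips)]
        )
        self.head = nn.Linear(16 * 2, num_classes)

    def forward(self, x):
        return self.head(torch.cat([s(x) for s in self.streams], dim=1))

model = MultiStreamNet()
logits = model(torch.randn(8, 1, 28, 28))  # e.g. a Scaled-MNIST-sized batch
print(logits.shape)  # torch.Size([8, 10])

Replacing the per-stream max with a learned weighted sum over the orbit (a simple form of invariant integration) is the direction the paper pursues; the sketch above only illustrates the architectural idea of combining streams with different guaranteed invariances.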
Pages: 21