Invertible Residual Networks

Cited by: 0
Authors
Behrmann, Jens [1 ,2 ,3 ]
Grathwohl, Will [2 ,3 ]
Chen, Ricky T. Q. [2 ,3 ]
Duvenaud, David [2 ,3 ]
Jacobsen, Joern-Henrik [2 ,3 ]
Affiliations
[1] Univ Bremen, Ctr Ind Math, Bremen, Germany
[2] Vector Inst, Toronto, ON, Canada
[3] Univ Toronto, Toronto, ON, Canada
Keywords
DOI
Not available
CLC Number (Chinese Library Classification)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
We show that standard ResNet architectures can be made invertible, allowing the same model to be used for classification, density estimation, and generation. Typically, enforcing invertibility requires partitioning dimensions or restricting network architectures. In contrast, our approach only requires adding a simple normalization step during training, already available in standard frameworks. Invertible ResNets define a generative model which can be trained by maximum likelihood on unlabeled data. To compute likelihoods, we introduce a tractable approximation to the Jacobian log-determinant of a residual block. Our empirical evaluation shows that invertible ResNets perform competitively with both state-of-the-art image classifiers and flow-based generative models, something that has not been previously achieved with a single architecture.
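The abstract points to two technical ingredients: a normalization step that keeps each residual branch contractive, so that a block x -> x + g(x) is invertible and can be inverted by fixed-point iteration, and a tractable stochastic approximation of the Jacobian log-determinant needed for maximum-likelihood training. The sketch below illustrates both ideas in PyTorch; it is a minimal illustration under those assumptions, not the authors' reference implementation, and the names ContractiveResidualBlock and log_det_estimate are invented for this example (spectral normalization is used here as one possible choice of the normalization step).

    import torch
    import torch.nn as nn

    # Residual block y = x + g(x). If Lip(g) < 1, the block is invertible and
    # the inverse can be recovered by the fixed-point iteration x <- y - g(x).
    # Spectral normalization bounds the spectral norm of each linear layer,
    # and the extra coefficient < 1 makes the residual branch contractive.
    class ContractiveResidualBlock(nn.Module):
        def __init__(self, dim, hidden=64, coeff=0.9):
            super().__init__()
            self.g = nn.Sequential(
                nn.utils.spectral_norm(nn.Linear(dim, hidden)),
                nn.ELU(),
                nn.utils.spectral_norm(nn.Linear(hidden, dim)),
            )
            self.coeff = coeff  # scaling factor < 1 keeps Lip(coeff * g) < 1

        def forward(self, x):
            return x + self.coeff * self.g(x)

        def inverse(self, y, n_iter=50):
            # Banach fixed-point iteration; converges because the branch is contractive.
            x = y.clone()
            for _ in range(n_iter):
                x = y - self.coeff * self.g(x)
            return x

    def log_det_estimate(block, x, n_series=10):
        # Stochastic (Hutchinson-style) estimate of log|det(I + J_g)| using the
        # truncated power series log det(I + J) = sum_k (-1)^{k+1} tr(J^k) / k.
        x = x.requires_grad_(True)
        gx = block.coeff * block.g(x)
        v = torch.randn_like(x)       # single probe vector per sample
        w = v
        logdet = 0.0
        for k in range(1, n_series + 1):
            # vector-Jacobian product: w <- w^T J_g, so (w * v).sum approximates tr(J_g^k)
            w = torch.autograd.grad(gx, x, grad_outputs=w, retain_graph=True)[0]
            logdet = logdet + (-1) ** (k + 1) * (w * v).sum(dim=1) / k
        return logdet

    # Hypothetical usage: invert a block and estimate its log-determinant.
    block = ContractiveResidualBlock(dim=2)
    x = torch.randn(4, 2)
    y = block(x)
    x_rec = block.inverse(y)                      # close to x up to iteration tolerance
    ld = log_det_estimate(block, torch.randn(4, 2))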
Pages: 10
Related Papers
50 records in total
  • [1] Invertible Residual Blocks in Deep Learning Networks
    Wang, Ruhua
    An, Senjian
    Liu, Wanquan
    Li, Ling
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (07) : 10167 - 10173
  • [2] Invertible residual networks in the context of regularization theory for linear inverse problems
    Arndt, Clemens
    Denker, Alexander
    Dittmer, Soeren
    Heilenkoetter, Nick
    Iske, Meira
    Kluth, Tobias
    Maass, Peter
    Nickel, Judith
    [J]. INVERSE PROBLEMS, 2023, 39 (12)
  • [3] Bayesian view on the training of invertible residual networks for solving linear inverse problems
    Arndt, Clemens
    Dittmer, Soeren
    Heilenkoetter, Nick
    Iske, Meira
    Kluth, Tobias
    Nickel, Judith
    [J]. INVERSE PROBLEMS, 2024, 40 (04)
  • [4] Invertible Residual Neural Networks with Conditional Injector and Interpolator for Point Cloud Upsampling
    Mao, Aihua
    Duan, Yaqi
    Wen, Yu-Hui
    Du, Zihui
    Cai, Hongmin
    Liu, Yong-Jin
    [J]. PROCEEDINGS OF THE THIRTY-SECOND INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, IJCAI 2023, 2023, : 1267 - +
  • [5] Residual Flows for Invertible Generative Modeling
    Chen, Ricky T. Q.
    Behrmann, Jens
    Duvenaud, David
    Jacobsen, Joern-Henrik
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019), 2019, 32
  • [6] Residual Networks of Residual Networks: Multilevel Residual Networks
    Zhang, Ke
    Sun, Miao
    Han, Tony X.
    Yuan, Xingfang
    Guo, Liru
    Liu, Tao
    [J]. IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2018, 28 (06) : 1303 - 1314
  • [7] Measuring QCD splittings with invertible networks
    Bieringer, Sebastian
    Butter, Anja
    Heimel, Theo
    Hoeche, Stefan
    Koethe, Ullrich
    Plehn, Tilman
    Radev, Stefan T.
    [J]. SCIPOST PHYSICS, 2021, 10 (06):
  • [8] Generative invertible quantum neural networks
    Rousselot, Armand
    Spannowsky, Michael
    [J]. SCIPOST PHYSICS, 2024, 16 (06):
  • [9] Invertible Neural Networks for Graph Prediction
    Xu, Chen
    Cheng, Xiuyuan
    Xie, Yao
    [J]. IEEE JOURNAL ON SELECTED AREAS IN INFORMATION THEORY, 2022, 3 (03) : 454 - 467
  • [10] Invertible Neural Networks for Airfoil Design
    Glaws, Andrew
    King, Ryan N.
    Vijayakumar, Ganesh
    Ananthan, Shreyas
    [J]. AIAA JOURNAL, 2022, 60 (05) : 3035 - 3047