Revisiting Bayesian Autoencoders With MCMC

Cited by: 10
Authors
Chandra, Rohitash [1 ,3 ]
Jain, Mahir [2 ]
Maharana, Manavendra [2 ]
Krivitsky, Pavel N. [1 ,3 ]
Affiliations
[1] UNSW Sydney, Sch Math & Stat, Transit Artificial Intelligence Res Grp, Sydney, NSW 2052, Australia
[2] Manipal Inst Technol, Dept Comp & Commun, Manipal 576104, Karnataka, India
[3] UNSW Sydney, UNSW Data Sci Hub, Sydney, NSW 2052, Australia
Source
IEEE ACCESS, 2022, Vol. 10
Keywords
Bayes methods; Neural networks; Proposals; Deep learning; Decoding; Data models; Computational modeling; Bayesian deep learning; MCMC; Langevin dynamics; autoencoders; parallel tempering; deep learning; AUTO-ENCODERS; FACE RECOGNITION; NETWORKS; ROBUST;
DOI
10.1109/ACCESS.2022.3163270
Chinese Library Classification
TP [Automation Technology, Computer Technology]
Discipline Classification Code
0812
Abstract
Autoencoders gained popularity in the deep learning revolution for their ability to compress data and provide dimensionality reduction. Although prominent deep learning methods have been used to enhance autoencoders, robust uncertainty quantification remains a challenge; so far it has been addressed mainly with variational autoencoders. Bayesian inference via Markov Chain Monte Carlo (MCMC) sampling has faced several limitations for large models; however, recent advances in parallel computing and proposal schemes have opened routes less traveled. This paper presents Bayesian autoencoders powered by MCMC sampling, implemented with parallel computing and a Langevin-gradient proposal distribution. The results indicate that the proposed Bayesian autoencoder achieves accuracy similar to related methods in the literature while also providing uncertainty quantification in the reduced data representation. This motivates further applications of the Bayesian autoencoder framework to other deep learning models.
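For illustration only (this is not the authors' released code), the minimal Python sketch below shows the kind of Langevin-gradient Metropolis-Hastings step the abstract refers to, applied to a tiny linear autoencoder with a Gaussian likelihood and a Gaussian prior on the weights; the data, names, and hyperparameters (step_size, sigma, tau, chain length) are all assumptions made for the sketch.

import numpy as np

rng = np.random.default_rng(0)
n, d, h = 200, 8, 3                      # samples, input dim, latent dim
X = rng.normal(size=(n, d))              # toy data standing in for a real dataset
sigma, tau, step_size = 0.5, 1.0, 1e-3   # noise std, prior std, Langevin step (assumed values)

def unpack(theta):
    W1 = theta[:h * d].reshape(h, d)     # encoder weights
    W2 = theta[h * d:].reshape(d, h)     # decoder weights
    return W1, W2

def log_post(theta):
    W1, W2 = unpack(theta)
    recon = X @ W1.T @ W2.T              # decode(encode(X)) for a linear autoencoder
    log_lik = -0.5 * np.sum((X - recon) ** 2) / sigma ** 2
    log_prior = -0.5 * np.sum(theta ** 2) / tau ** 2
    return log_lik + log_prior

def grad_log_post(theta, eps=1e-5):
    # Central-difference gradient keeps the sketch framework-free;
    # a practical implementation would use backpropagation.
    g = np.zeros_like(theta)
    for i in range(theta.size):
        e = np.zeros_like(theta)
        e[i] = eps
        g[i] = (log_post(theta + e) - log_post(theta - e)) / (2 * eps)
    return g

def langevin_step(theta):
    # Langevin-gradient proposal: drift along the posterior gradient plus Gaussian noise.
    mu = theta + 0.5 * step_size ** 2 * grad_log_post(theta)
    prop = mu + step_size * rng.normal(size=theta.shape)
    # The proposal is asymmetric, so both q(prop | theta) and q(theta | prop)
    # enter the Metropolis-Hastings acceptance ratio.
    mu_rev = prop + 0.5 * step_size ** 2 * grad_log_post(prop)
    log_q_fwd = -0.5 * np.sum((prop - mu) ** 2) / step_size ** 2
    log_q_rev = -0.5 * np.sum((theta - mu_rev) ** 2) / step_size ** 2
    log_alpha = log_post(prop) - log_post(theta) + log_q_rev - log_q_fwd
    return prop if np.log(rng.uniform()) < log_alpha else theta

theta = rng.normal(scale=0.1, size=h * d + d * h)   # initial weights
samples = []
for _ in range(200):                                # short chain for illustration
    theta = langevin_step(theta)
    samples.append(theta.copy())                    # posterior samples of the weights

Posterior samples of the weights induce a distribution over latent encodings and reconstructions, which is how such a framework yields uncertainty in the reduced data representation; parallel tempering, as mentioned in the keywords, would run several such chains at different temperatures and occasionally swap their states.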
Pages: 40482-40495 (14 pages)