Large Scale Bayesian Inference and Experimental Design for Sparse Linear Models

Cited: 39
Authors
Seeger, Matthias W. [1]
Nickisch, Hannes [2]
Affiliations
[1] Ecole Polytech Fed Lausanne, Sch Comp & Commun Sci, CH-1015 Lausanne, Switzerland
[2] Max Planck Inst Biol Cybernet, D-72076 Tübingen, Germany
Source
SIAM JOURNAL ON IMAGING SCIENCES | 2011, Vol. 4, No. 1
Keywords
sparse linear model; sparsity prior; experimental design; sampling optimization; image acquisition; variational approximate inference; Bayesian statistics; compressive sensing; sparse reconstruction; magnetic resonance imaging; maximum likelihood; selection; algorithms; regression; robust
DOI
10.1137/090758775
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Many problems of low-level computer vision and image processing, such as denoising, deconvolution, tomographic reconstruction or superresolution, can be addressed by maximizing the posterior distribution of a sparse linear model (SLM). We show how higher-order Bayesian decision-making problems, such as optimizing image acquisition in magnetic resonance scanners, can be addressed by querying the SLM posterior covariance, unrelated to the density's mode. We propose a scalable algorithmic framework, with which SLM posteriors over full, high-resolution images can be approximated for the first time, solving a variational optimization problem which is convex if and only if posterior mode finding is convex. These methods successfully drive the optimization of sampling trajectories for real-world magnetic resonance imaging through Bayesian experimental design, which has not been attempted before. Our methodology provides new insight into similarities and differences between sparse reconstruction and approximate Bayesian inference, and has important implications for compressive sensing of real-world images. Parts of this work have been presented at conferences [M. Seeger, H. Nickisch, R. Pohmann, and B. Schölkopf, in Advances in Neural Information Processing Systems 21, D. Koller, D. Schuurmans, Y. Bengio, and L. Bottou, eds., Curran Associates, Red Hook, NY, 2009, pp. 1441-1448; H. Nickisch and M. Seeger, in Proceedings of the 26th International Conference on Machine Learning, L. Bottou and M. Littman, eds., Omni Press, Madison, WI, 2009, pp. 761-768].
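The abstract contrasts two uses of an SLM: finding the posterior mode (sparse MAP reconstruction) and querying the posterior covariance to score candidate measurements (Bayesian experimental design). The sketch below is illustrative only, not the paper's algorithm: it uses plain iterative soft-thresholding (ISTA) for the Laplace-prior MAP, and a Gaussian surrogate covariance as a stand-in for the paper's variational approximation; all function names, the toy problem sizes, and the hyperparameter values are our own assumptions.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of the l1 norm: shrink each entry toward zero by t.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def map_estimate(X, y, lam, n_iter=500):
    # MAP reconstruction for y = X u + Gaussian noise under a Laplace
    # sparsity prior on u, via iterative soft-thresholding (ISTA).
    L = np.linalg.norm(X, 2) ** 2  # Lipschitz constant of the quadratic term
    u = np.zeros(X.shape[1])
    for _ in range(n_iter):
        grad = X.T @ (X @ u - y)
        u = soft_threshold(u - grad / L, lam / L)
    return u

def design_scores(X, candidates, sigma2=1.0, tau2=1.0):
    # Score candidate measurement rows by the information they would add,
    # using a Gaussian surrogate posterior with precision
    # A = X^T X / sigma2 + I / tau2 (a crude stand-in for the variational
    # posterior covariance the paper computes at scale).
    A = X.T @ X / sigma2 + np.eye(X.shape[1]) / tau2
    A_inv = np.linalg.inv(A)
    # Appending a row x raises the log-determinant of the precision by
    # log(1 + x^T A^{-1} x / sigma2); larger means more informative.
    return [0.5 * np.log1p(c @ A_inv @ c / sigma2) for c in candidates]

# Toy compressive-sensing instance: 5-sparse signal, 60 random measurements.
rng = np.random.default_rng(0)
n, d, k = 60, 100, 5
u_true = np.zeros(d)
u_true[rng.choice(d, k, replace=False)] = rng.normal(size=k)
X = rng.normal(size=(n, d)) / np.sqrt(n)
y = X @ u_true + 0.01 * rng.normal(size=n)

u_hat = map_estimate(X, y, lam=0.005)               # mode finding
cand = rng.normal(size=(3, d)) / np.sqrt(n)
scores = design_scores(X, cand, sigma2=1e-2)        # covariance query
```

The point of the contrast: `map_estimate` never touches posterior uncertainty, while `design_scores` depends only on the (approximate) covariance, which is why the paper's design loop requires inference machinery beyond sparse reconstruction.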
Pages: 166-199
Page count: 34
Related Papers
50 records in total
  • [1] Sparse linear models: Variational approximate inference and Bayesian experimental design
    Seeger, Matthias W.
    [J]. INTERNATIONAL WORKSHOP ON STATISTICAL-MECHANICAL INFORMATICS 2009 (IW-SMI 2009), 2009, 197
  • [2] Bayesian inference for sparse generalized linear models
    Seeger, Matthias
    Gerwinn, Sebastian
    Bethge, Matthias
    [J]. MACHINE LEARNING: ECML 2007, PROCEEDINGS, 2007, 4701 : 298 - +
  • [3] Bayesian inference and optimal design for the sparse linear model
    Seeger, Matthias W.
    [J]. JOURNAL OF MACHINE LEARNING RESEARCH, 2008, 9 : 759 - 813
  • [4] A sparse matrix approach to Bayesian computation in large linear models
    Wilkinson, DJ
    Yeung, SK
    [J]. COMPUTATIONAL STATISTICS & DATA ANALYSIS, 2004, 44 (03) : 493 - 516
  • [5] An Empirical Study on Distributed Bayesian Approximation Inference of Piecewise Sparse Linear Models
    Asahara, Masato
    Fujimaki, Ryohei
    [J]. IEEE TRANSACTIONS ON PARALLEL AND DISTRIBUTED SYSTEMS, 2019, 30 (07) : 1481 - 1493
  • [6] Experimental Design on a Budget for Sparse Linear Models and Applications
    Ravi, Sathya N.
    Ithapu, Vamsi K.
    Johnson, Sterling C.
    Singh, Vikas
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 48, 2016, 48
  • [7] On the sparse Bayesian learning of linear models
    Yee, Chia Chye
    Atchade, Yves F.
    [J]. COMMUNICATIONS IN STATISTICS-THEORY AND METHODS, 2017, 46 (15) : 7672 - 7691
  • [8] Scalable Algorithms for Bayesian Inference of Large-Scale Models from Large-Scale Data
    Ghattas, Omar
    Isaac, Tobin
    Petra, Noemi
    Stadler, Georg
    [J]. HIGH PERFORMANCE COMPUTING FOR COMPUTATIONAL SCIENCE - VECPAR 2016, 2017, 10150 : 3 - 6
  • [9] Bayesian Inference of Conformational State Populations from Computational Models and Sparse Experimental Observables
    Voelz, Vincent A.
    Zhou, Guangfeng
    [J]. JOURNAL OF COMPUTATIONAL CHEMISTRY, 2014, 35 (30) : 2215 - 2224
  • [10] On approximate Bayesian methods for large-scale sparse linear inverse problems
    Altmann, Yoann
    [J]. 2022 30TH EUROPEAN SIGNAL PROCESSING CONFERENCE (EUSIPCO 2022), 2022, : 2191 - 2195