HIERARCHICAL LEARNING OF SPARSE IMAGE REPRESENTATIONS USING STEERED MIXTURE-OF-EXPERTS

Cited: 0
Authors
Jongebloed, Rolf [1 ]
Verhack, Ruben [1 ,2 ]
Lange, Lieven [1 ]
Sikora, Thomas [1 ]
Affiliations
[1] Tech Univ Berlin, Commun Syst Lab, Berlin, Germany
[2] Univ Ghent, IMEC, IDLab, Dept Elect & Informat Syst ELIS, Ghent, Belgium
Keywords
Steered Mixture-of-Experts; Sparse Representation; Hidden Markov Random Field; Denoising; Image Signal Processing; Inference;
DOI: not available
Chinese Library Classification: TP3 [Computing technology; computer technology]
Discipline code: 0812
Abstract
Previous research showed highly efficient compression results at low bit-rates using Steered Mixture-of-Experts (SMoE); higher rates, however, still pose a challenge due to the non-convex optimization problem, which becomes more difficult as the number of components increases. We therefore introduce a novel estimation method based on Hidden Markov Random Fields that takes the spatial dependencies of neighboring pixels into account, combined with a tree-structured splitting strategy. Experimental evaluations on images show that our approach outperforms state-of-the-art techniques using only one robust parameter set. For video and light-field modeling, even more gain can be expected.
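The SMoE model summarized above represents an image as a gated sum of simple experts, where steered 2-D Gaussian kernels act as soft gates over pixel coordinates. A minimal reconstruction sketch, with hypothetical parameter names and assuming linear experts as in earlier SMoE work (the paper's HMRF-based estimation itself is not shown), might look like:

```python
import numpy as np

def smoe_reconstruct(coords, mus, covs, priors, slopes, offsets):
    """Sketch: reconstruct pixel values from a 2-D SMoE model.

    coords:  (N, 2) pixel positions
    mus:     (K, 2) kernel centers
    covs:    (K, 2, 2) steering covariance matrices
    priors:  (K,) mixing weights
    slopes:  (K, 2) linear-expert gradients (hypothetical parameterization)
    offsets: (K,)   linear-expert offsets
    """
    K = len(priors)
    resp = np.empty((coords.shape[0], K))
    for k in range(K):
        d = coords - mus[k]
        inv = np.linalg.inv(covs[k])
        mah = np.einsum('ni,ij,nj->n', d, inv, d)  # Mahalanobis distance
        norm = priors[k] / (2 * np.pi * np.sqrt(np.linalg.det(covs[k])))
        resp[:, k] = norm * np.exp(-0.5 * mah)
    gates = resp / resp.sum(axis=1, keepdims=True)  # soft gating (softmax over kernels)
    experts = coords @ slopes.T + offsets           # (N, K) linear expert predictions
    return (gates * experts).sum(axis=1)            # gated blend per pixel
```

The steering covariances let each kernel elongate along edges, which is what makes the representation sparse for piecewise-smooth image regions.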
Pages: 6