Exploring unsupervised pre-training for echo state networks

Cited by: 0
Authors
Peter Steiner
Azarakhsh Jalalvand
Peter Birkholz
Affiliations
[1] Technische Universität Dresden, Institute for Acoustics and Speech Communication
[2] Princeton University, Mechanical and Aerospace Engineering Department
Source
Neural Computing & Applications, 2023, 35(34)
Keywords
ESN; RCN; Clustering; State machine
DOI
Not available
Abstract
Echo State Networks (ESNs) are a special type of Recurrent Neural Network (RNN) in which the input and recurrent connections are traditionally generated randomly and only the output weights are trained. However, recent publications have shown that a purely random initialization may not be ideal; instead, completely deterministic or data-driven ESN structures have been proposed. In this work, an unsupervised training methodology for the hidden components of an ESN is proposed. Motivated by traditional Hidden Markov Models (HMMs), which have been widely used for speech recognition for decades, we present an unsupervised pre-training method for the recurrent and bias weights of ESNs. This approach allows unlabeled data to be used during training and achieves superior results for continuous spoken phoneme recognition, as well as for a large variety of time-series classification datasets.
Pages: 24225–24242 (17 pages)
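
The abstract describes the ESN architecture and the pre-training idea only verbally. Below is a minimal Python sketch (NumPy and scikit-learn) of the general scheme: a leaky-integrator reservoir whose input and recurrent weights are conventionally random, plus a hypothetical data-driven initialization that replaces the random input weights with K-Means centroids computed on unlabeled data. All dimensions and hyperparameters are illustrative assumptions, and the paper's actual procedure for pre-training the recurrent and bias weights is not specified in the abstract and may differ.

import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

n_in, n_res = 3, 100   # input and reservoir dimensionality (assumed)
leak, rho = 0.3, 0.9   # leak rate and target spectral radius (assumed)

# Conventional ESN: input weights, recurrent weights, and biases are
# drawn at random, rescaled once, and never trained.
W_in = rng.uniform(-1.0, 1.0, (n_res, n_in))
W_res = rng.standard_normal((n_res, n_res))
W_res *= rho / np.max(np.abs(np.linalg.eigvals(W_res)))  # fix spectral radius
b = rng.uniform(-0.5, 0.5, n_res)

def run_reservoir(U, W_in, W_res, b):
    """Collect leaky-integrated reservoir states for a sequence U of shape (T, n_in)."""
    x = np.zeros(n_res)
    states = np.empty((len(U), n_res))
    for t, u in enumerate(U):
        x = (1 - leak) * x + leak * np.tanh(W_in @ u + W_res @ x + b)
        states[t] = x
    return states

# Hypothetical unsupervised pre-training step: cluster the unlabeled
# training inputs with K-Means and use the centroids as input weights.
# The paper extends data-driven initialization to the recurrent and
# bias weights; the exact method there may differ from this sketch.
U_train = rng.standard_normal((2000, n_in))   # stand-in for unlabeled data
km = KMeans(n_clusters=n_res, n_init=10, random_state=0).fit(U_train)
W_in_pre = km.cluster_centers_                # shape (n_res, n_in)

S = run_reservoir(U_train, W_in_pre, W_res, b)

# Only the output weights are trained, typically by ridge regression:
#   W_out = np.linalg.solve(S.T @ S + alpha * np.eye(n_res), S.T @ Y)

Rescaling W_res so that its spectral radius stays below 1 is the usual way to preserve the echo state property, i.e. to keep the reservoir dynamics from amplifying past inputs indefinitely; centroid-based input weights then tile the input space so that different reservoir neurons respond to different input regions.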