Group-linking method: A unified benchmark for machine learning with recurrent neural network

Cited by: 0
Authors
Lin, Tsungnan [1 ,2 ]
Giles, C. Lee [3 ]
Affiliations
[1] Natl Taiwan Univ, Dept Elect Engn, Taipei 10764, Taiwan
[2] Natl Taiwan Univ, Grad Inst Commun Engn, Taipei 10764, Taiwan
[3] Penn State Univ, eBusiness Res Ctr, University Pk, PA 16802 USA
Keywords
recurrent neural networks; finite state machines; grammatical inference; NARX neural networks;
DOI
10.1093/ietfec/e90-a.12.2916
CLC number
TP3 [Computing technology; computer technology]
Subject classification code
0812
Abstract
This paper proposes a method (the Group-Linking Method) that controls the complexity of a sequential function in order to construct Finite Memory Machines (FMMs) of minimal order: the machines have the largest number of states achievable given their memory taps. Finding a machine with the maximum number of states is a nontrivial problem because the total number of machines with memory order k is (256)^(2k-2), a very large number. Analysis of the Group-Linking Method shows that the data necessary to reconstruct an FMM are the strings no longer than the depth of the machine plus one, which is significantly less than what traditional greedy-based machine learning algorithms require. The Group-Linking Method thus provides a systematic way of generating unified benchmarks for evaluating the capability of machine learning techniques; one example is testing the learning capability of recurrent neural networks. The problem of encoding finite state machines in recurrent neural networks has been extensively explored, but the great representational power of these networks does not guarantee that a solution, in terms of learning, exists. Previous learning benchmarks are shown to be structurally not rich enough in terms of solutions in weight space. This set of benchmarks with great expressive power can serve as a convenient framework in which to study the learning and computational capabilities of various network models. A fundamental understanding of the capabilities of these networks will allow users to select the most appropriate model for a given application.
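As a rough illustration of the kind of object the abstract describes (not the authors' actual construction), the Python sketch below defines a toy finite memory machine whose output at each step is a Boolean function of the current input and the k previous input bits, and enumerates every binary string of length at most depth + 1 as a candidate training set. The specific FMM definition, the random choice of the Boolean function, the zero-initialised memory, and the helper names (ToyFMM, enumerate_strings) are assumptions made only for illustration.

```python
# Hypothetical illustration of a finite memory machine (FMM) of order k.
# Assumption: output at time t is a Boolean function of the current input
# and the k previous input bits (one possible reading of "memory taps").
import itertools
import random


class ToyFMM:
    """Toy order-k FMM: y(t) = f(u(t), u(t-1), ..., u(t-k))."""

    def __init__(self, k, seed=0):
        self.k = k
        rng = random.Random(seed)
        # One output bit for each of the 2^(k+1) possible tap patterns.
        self.table = {bits: rng.randint(0, 1)
                      for bits in itertools.product((0, 1), repeat=k + 1)}

    def run(self, inputs):
        """Feed a binary string (iterable of 0/1); return the output sequence."""
        taps = (0,) * self.k                 # assumed zero-initialised memory
        outputs = []
        for u in inputs:
            outputs.append(self.table[(u,) + taps])
            taps = ((u,) + taps)[:self.k]    # shift the input memory window
        return outputs


def enumerate_strings(max_len):
    """All binary strings of length 1..max_len (the abstract's depth + 1 bound)."""
    for n in range(1, max_len + 1):
        yield from itertools.product((0, 1), repeat=n)


if __name__ == "__main__":
    k = 3
    fmm = ToyFMM(k)
    depth = k                                # assumption: depth equals the memory order
    dataset = [(s, fmm.run(s)) for s in enumerate_strings(depth + 1)]
    print(f"{len(dataset)} labelled strings of length <= {depth + 1}")
```

A labelled set of short strings like this is the kind of training corpus the abstract argues is sufficient to reconstruct the machine; whether a given recurrent network can actually learn it is the benchmark question.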
Pages: 2916-2929
Number of pages: 14