Neural Network Learning to Discover Laws Ruling Noisy Empirical Data

Cited by: 0
Authors
Majewski, Jaroslaw [1 ]
Wojtyna, Ryszard [1 ]
Affiliations
[1] Univ Technol & Life Sci, Fac Telecommun Comp Sci & Elect Engn, Ul Kaliskiego 7, PL-85796 Bydgoszcz, Poland
Source
2014 SIGNAL PROCESSING: ALGORITHMS, ARCHITECTURES, ARRANGEMENTS, AND APPLICATIONS (SPA) | 2014
Keywords
Neural networks; ANN training in the presence of noise; rules governing numerical data; symbolic description;
DOI
Not available
Chinese Library Classification
TP3 [Computing Technology, Computer Technology];
Discipline Classification Code
0812;
Abstract
This paper discusses improving the learning effectiveness of special neural networks (SNNs) that aid the process of discovering hidden rules governing a given empirical data set. The SNNs are based on reciprocal functions of the 1/(.) type, used as activation functions. These functions are located mainly in the hidden layer and in the input nodes of the network, which is a specific characteristic of our SNNs. The SNN structure is simpler than that of other networks applied to similar problems [1-15]. Previous attempts to train such networks have not led to fully satisfactory results [16], [17]; one of the main reasons is the noise present in the considered discrete empirical data. In this paper, a new methodology for SNN training is presented. The proposed approach introduces into the learning technique a suitably prepared knowledge base in order to cope with the adverse influence of noise on the training effects. In this way it is possible, for example, to eliminate from the learning process unwanted increases of the SNN weights when it is assumed that the symbolic law describing the given data set has the form of a monotonically decreasing function. Results of learning with and without the knowledge base are compared, and the superiority of the proposed approach over previously presented ones is shown. For simplicity, the presented description and results are restricted to a one-dimensional relationship.
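The paper itself publishes no code, so the following is only a minimal sketch, under stated assumptions, of the kind of setup the abstract describes: a small network with 1/(.) reciprocal activations in the hidden layer, fitted to noisy one-dimensional data, with a knowledge-base term that discourages solutions violating an assumed monotonically decreasing law. The class name ReciprocalSNN, the PyTorch framework, the layer sizes, the eps guard, and the penalty weight are all illustrative assumptions, not details taken from the paper.

```python
# Illustrative sketch only; not the authors' implementation.
import torch
import torch.nn as nn

class ReciprocalSNN(nn.Module):
    """Small 1-D network using 1/(.) reciprocal activations in the hidden layer."""
    def __init__(self, hidden: int = 8, eps: float = 1e-3):
        super().__init__()
        self.hidden_layer = nn.Linear(1, hidden)
        self.out = nn.Linear(hidden, 1)
        self.eps = eps  # keeps the reciprocal away from division by zero

    def forward(self, x):
        z = self.hidden_layer(x)
        # reciprocal activation 1/z; arguments too close to zero are clamped to eps
        z_safe = torch.where(z.abs() < self.eps, torch.full_like(z, self.eps), z)
        return self.out(1.0 / z_safe)

def monotone_decrease_penalty(model, x):
    """Assumed knowledge-base term: penalizes positive slopes of the fitted curve,
    encoding the assumption that the hidden law is monotonically decreasing."""
    x = x.clone().requires_grad_(True)
    y = model(x)
    (dy_dx,) = torch.autograd.grad(y.sum(), x, create_graph=True)
    return torch.relu(dy_dx).mean()

# Synthetic noisy 1-D data drawn from a decreasing law, e.g. y = 1/x plus noise.
torch.manual_seed(0)
x = torch.linspace(0.5, 5.0, 200).unsqueeze(1)
y = 1.0 / x + 0.05 * torch.randn_like(x)

model = ReciprocalSNN()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)
for step in range(2000):
    optimizer.zero_grad()
    fit_loss = nn.functional.mse_loss(model(x), y)
    # data-fit term plus the knowledge-base penalty (weight 1.0 chosen arbitrarily)
    loss = fit_loss + 1.0 * monotone_decrease_penalty(model, x)
    loss.backward()
    optimizer.step()
```

In this sketch the knowledge base acts as a soft constraint added to the loss; the paper's actual mechanism (e.g., directly blocking unwanted weight increases during training) may differ.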
Pages: 31-35
Page count: 5