Granular neural networks: The development of granular input spaces and parameters spaces through a hierarchical allocation of information granularity

Cited: 6
Authors
Song, Mingli [1 ,2 ]
Jing, Yukai [1 ]
Affiliations
[1] Commun Univ China, Sch Comp Sci & Cybersecur, Beijing, Peoples R China
[2] Commun Univ China, State Key Lab Media Convergence & Commun, Beijing, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Granular neural networks; Hierarchical allocation; Information granularity; PSO; Interval analysis; ALGORITHM; SYSTEMS; FUSION;
DOI
10.1016/j.ins.2019.12.081
CLC number
TP [automation technology, computer technology];
Subject classification code
0812;
Abstract
The issue of granular output optimization of neural networks with fixed connections within a given input space is explored. Numeric output optimization is already a highly nonlinear problem when nonlinear activation functions are used; granular output optimization is an even more challenging task. We solve the problem by developing an optimal hierarchical allocation of information granularity, proposing a new objective function that considers both specificity and evidence, and engaging efficient techniques of evolutionary optimization. In contrast to existing techniques, the hierarchical approach builds a three-level hierarchy to allocate information granularity to the input space and to the architecture (parameters) of the network. Granulating both the input features and the architecture at the same time yields a different result from single-factor granulation. The constructed granular neural network emphasizes the abstract nature of data and the granular nature of the nonlinear mapping realized by the architecture. Experimental studies completed on synthetic data and publicly available data sets demonstrate the algorithm. (C) 2020 Elsevier Inc. All rights reserved.
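To make the abstract's core idea concrete, below is a minimal sketch of granular (interval) output propagation: numeric inputs and connection weights are each wrapped into intervals whose radii represent an allocated amount of information granularity, and the intervals are pushed through a sigmoid neuron via interval arithmetic. This is an illustrative assumption-laden toy, not the paper's three-level hierarchical allocation or its PSO-based objective; the function names (`granular_neuron`, `interval_dot`) and the fixed radii `eps_x`, `eps_w` are hypothetical.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def interval_dot(x_lo, x_hi, w_lo, w_hi):
    """Interval dot product: each term's bounds are the min/max
    over the four endpoint products; sum the bounds termwise."""
    p = np.stack([x_lo * w_lo, x_lo * w_hi, x_hi * w_lo, x_hi * w_hi])
    return p.min(axis=0).sum(), p.max(axis=0).sum()

def granular_neuron(x, w, eps_x, eps_w):
    """Granulate inputs x and weights w into intervals of radii
    eps_x and eps_w (the allocated granularity), then propagate
    through a single sigmoid neuron, returning the interval output."""
    x_lo, x_hi = x - eps_x, x + eps_x
    w_lo, w_hi = w - eps_w, w + eps_w
    lo, hi = interval_dot(x_lo, x_hi, w_lo, w_hi)
    # sigmoid is monotone, so applying it to the bounds is exact
    return sigmoid(lo), sigmoid(hi)

x = np.array([0.5, -0.2])
w = np.array([1.0, 0.8])
lo, hi = granular_neuron(x, w, eps_x=0.05, eps_w=0.02)
```

Under this reading, the paper's allocation problem amounts to choosing how a total granularity budget is split between `eps_x` (input space) and `eps_w` (parameters) so that the resulting interval outputs are both specific (narrow) and cover the observed evidence.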
Pages: 148-166
Page count: 19