Deep Fuzzy Min-Max Neural Network: Analysis and Design

Cited: 9
Authors
Huang, Wei [1 ]
Sun, Mingxi [2 ]
Zhu, Liehuang [3 ]
Oh, Sung-Kwun [4 ]
Pedrycz, Witold [5 ]
Affiliations
[1] Beijing Inst Technol, Sch Cyberspace Secur & Technol, Beijing, Peoples R China
[2] Tianjin Univ Technol, Sch Comp Sci & Engn, Tianjin, Peoples R China
[3] Beijing Inst Technol, Sch Cyberspace Secur & Technol, Beijing, Peoples R China
[4] Univ Suwon, Sch Elect & Elect Engn, Hwaseong, South Korea
[5] Univ Alberta, Dept Elect & Comp Engn, Edmonton, AB, Canada
Funding
National Natural Science Foundation of China;
Keywords
Deep fuzzy min-max neural network (DFMNN); fuzzy min-max neural network (FMNN); hyperbox; overlap; COMPUTATIONAL INTELLIGENCE ALGORITHM; OPTIMIZATION; RULE;
DOI
10.1109/TNNLS.2022.3226040
CLC number
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The fuzzy min-max neural network (FMNN) is a three-layer model based on hyperboxes that are constructed sequentially. This sequential mechanism inevitably leads to problems of input-order dependence and overlapping regions. In this study, we propose a deep FMNN (DFMNN) based on initialization and optimization operations to overcome these limitations. The initialization operation, which resolves the input-order problem, designs all hyperboxes simultaneously, and side parameters are introduced to control hyperbox size. The optimization operation, which eliminates the overlap-region problem, is realized through deep layers, where the number of layers is determined as soon as the overlap among hyperboxes is eliminated. In the optimization process, each layer consists of three sections: the partition section, the combination section, and the union section. The partition section divides the hyperboxes into a nonoverlapping hyperbox set and an overlapping hyperbox set. The combination section eliminates overlap within the overlapping hyperbox set. The union section produces the optimized hyperbox set for the current layer. DFMNN is evaluated on a series of benchmark datasets, and a comparative analysis shows that the proposed model outperforms several models previously reported in the literature.
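The partition section described above can be sketched in a few lines: two axis-aligned hyperboxes, each given by a min-point and max-point, overlap exactly when their intervals intersect along every dimension. The sketch below is a minimal illustration of that test and of splitting a hyperbox set into nonoverlapping and overlapping subsets; the function names are hypothetical and this is not the authors' implementation.

```python
import numpy as np

def hyperboxes_overlap(v1, w1, v2, w2):
    # Axis-aligned hyperboxes [v1, w1] and [v2, w2] overlap iff
    # their intervals intersect along every dimension.
    return bool(np.all(np.maximum(v1, v2) <= np.minimum(w1, w2)))

def partition(boxes):
    """Split boxes into a nonoverlapping set and an overlapping set,
    mirroring the partition section of a DFMNN layer."""
    overlapping_idx = set()
    for i in range(len(boxes)):
        for j in range(i + 1, len(boxes)):
            if hyperboxes_overlap(*boxes[i], *boxes[j]):
                overlapping_idx.update((i, j))
    non_overlap = [boxes[i] for i in range(len(boxes)) if i not in overlapping_idx]
    overlap = [boxes[i] for i in sorted(overlapping_idx)]
    return non_overlap, overlap

# Example in 2-D: boxes a and b overlap, box c is disjoint from both.
a = (np.array([0.0, 0.0]), np.array([0.5, 0.5]))
b = (np.array([0.4, 0.4]), np.array([0.9, 0.9]))
c = (np.array([0.7, 0.0]), np.array([0.9, 0.2]))
non_overlap, overlap = partition([a, b, c])
print(len(non_overlap), len(overlap))  # 1 2
```

The combination and union sections would then resolve the boxes in `overlap` and merge the result with `non_overlap`; the paper determines the layer count by repeating this process until `overlap` is empty.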
Pages: 8229-8240 (12 pages)
Related papers (50 in total)
  • [1] Redefined Fuzzy Min-Max Neural Network
    Wang, Yage
    Huang, Wei
    Wang, Jinsong
    [J]. 2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2021,
  • [2] Fuzzy min-max neural network for image segmentation
    Estévez, PA
    Ruz, GA
    Perez, CA
    [J]. PROCEEDINGS OF THE 7TH JOINT CONFERENCE ON INFORMATION SCIENCES, 2003, : 655 - 659
  • [3] A General Reflex Fuzzy Min-Max Neural Network
    Nandedkar, A. V.
    Biswas, P. K.
    [J]. ENGINEERING LETTERS, 2007, 14 (01)
  • [4] Fuzzy min-max neural networks: a bibliometric and social network analysis
    Kenger, Omer Nedim
    Ozceylan, Eren
    [J]. NEURAL COMPUTING & APPLICATIONS, 2023, 35 (07): : 5081 - 5111
  • [5] An Improved Fuzzy Min-Max Neural Network for Data Classification
    Kumar, Santhos A.
    Kumar, Anil
    Bajaj, Varun
    Singh, Girish Kumar
    [J]. IEEE TRANSACTIONS ON FUZZY SYSTEMS, 2020, 28 (09) : 1910 - 1924
  • [6] Efficient Fuzzy Min-Max Neural Network for Pattern Classification
    Anand, Milind
    Kanth, Ravi R.
    Dhabu, Meera
    [J]. SMART TRENDS IN INFORMATION TECHNOLOGY AND COMPUTER COMMUNICATIONS, SMARTCOM 2016, 2016, 628 : 840 - 846
  • [7] Agglomerative learning for general fuzzy min-max neural network
    Gabrys, B
    [J]. NEURAL NETWORKS FOR SIGNAL PROCESSING X, VOLS 1 AND 2, PROCEEDINGS, 2000, : 692 - 701
  • [8] Agglomerative learning for general fuzzy min-max neural network
    Gabrys, Bogdan
    [J]. Neural Networks for Signal Processing - Proceedings of the IEEE Workshop, 2000, 2 : 692 - 701
  • [9] An Enhanced Fuzzy Min-Max Neural Network for Pattern Classification
    Mohammed, Mohammed Falah
    Lim, Chee Peng
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2015, 26 (03) : 417 - 429
  • [10] Fuzzy min-max neural network based decision trees
    Mirzamomen, Zahra
    Kangavari, Mohammad Reza
    [J]. INTELLIGENT DATA ANALYSIS, 2016, 20 (04) : 767 - 782