Two-layer tree-connected feed-forward neural network model for neural cryptography

Cited by: 12
Authors
Lei, Xinyu [1 ]
Liao, Xiaofeng [1 ]
Chen, Fei [2 ]
Huang, Tingwen [3 ]
Affiliations
[1] Chongqing Univ, Coll Comp Sci, Chongqing 400044, Peoples R China
[2] Chinese Univ Hong Kong, Dept Comp Sci & Engn, Hong Kong, Hong Kong, Peoples R China
[3] Texas A&M Univ, Doha, Qatar
Source
PHYSICAL REVIEW E | 2013, Vol. 87, Issue 03
Funding
National Natural Science Foundation of China;
Keywords
SYNCHRONIZATION; INFORMATION;
DOI
10.1103/PhysRevE.87.032811
Chinese Library Classification (CLC)
O35 [Fluid Mechanics]; O53 [Plasma Physics];
Subject Classification Codes
070204 ; 080103 ; 080704 ;
Abstract
Neural synchronization by means of mutual learning provides an avenue for designing public key exchange protocols, giving rise to what is known as neural cryptography. Two identically structured neural networks learn from each other and eventually reach full synchronization. Full synchronization leaves the two networks with identical weights, which can serve as a secret key for many subsequent cryptographic purposes. It is striking that, after the first decade of neural cryptography, the tree parity machine (TPM) network with K = 3 hidden units appears to be the only network suitable for a neural protocol; despite considerable research effort, no convincingly secure neural protocol has been designed using other network structures. To overcome this limitation on network structure, in this paper we develop a two-layer tree-connected feed-forward neural network (TTFNN) model for a neural protocol. The TTFNN model captures the notion that two partners can exchange a vector of multiple bits in each time step. We then undertake an in-depth study of the dynamic process of TTFNN-based protocols, from which a feasibility condition is theoretically obtained for identifying applicable protocols. Afterward, guided by two analytically derived heuristic rules, a complete methodology for designing feasible TTFNN-based protocols is elaborated. A variety of feasible neural protocols are constructed, demonstrating the effectiveness and benefits of the proposed model. From the perspective of application, TTFNN-based instances that outperform the conventional TPM-based protocol in synchronization speed are also experimentally confirmed.
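The mutual-learning synchronization that the abstract describes can be sketched for the classic TPM case it mentions (K = 3 hidden units). This is a minimal illustrative sketch, not the paper's TTFNN construction: the Hebbian update rule, the parameter values (N, L), and all variable names below are standard textbook choices for TPM synchronization, assumed here for illustration.

```python
import numpy as np

# Illustrative tree parity machine (TPM) synchronization by mutual learning.
# K hidden units, N inputs per unit, weights bounded in [-L, L].
K, N, L = 3, 20, 3
rng = np.random.default_rng(0)

def tpm_output(w, x):
    # sigma_k: sign of each hidden unit's local field; tau: their product
    sigma = np.sign(np.sum(w * x, axis=1))
    sigma[sigma == 0] = -1
    return sigma, int(np.prod(sigma))

def hebbian_update(w, x, sigma, tau):
    # Update only hidden units that agree with the network output,
    # then clip weights back into the range [-L, L].
    for k in range(K):
        if sigma[k] == tau:
            w[k] += tau * x[k]
    np.clip(w, -L, L, out=w)

# Two partners start from independent random weights.
wA = rng.integers(-L, L + 1, size=(K, N))
wB = rng.integers(-L, L + 1, size=(K, N))

steps = 0
while not np.array_equal(wA, wB) and steps < 100_000:
    x = rng.choice([-1, 1], size=(K, N))  # common public input vector
    sA, tauA = tpm_output(wA, x)
    sB, tauB = tpm_output(wB, x)
    if tauA == tauB:  # both partners update only when outputs agree
        hebbian_update(wA, x, sA, tauA)
        hebbian_update(wB, x, sB, tauB)
    steps += 1

# After full synchronization, wA == wB: the shared weights form the key.
```

The TTFNN model of the paper generalizes this exchange so that a vector of multiple bits, rather than a single output bit, is exchanged per time step.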
Pages: 12
Related Papers
50 records in total
  • [1] Successive Feed-Forward Neural Network for Learning Fuzzy Decision Tree
    Singh, Manu Pratap
    Lavania, Rajesh
    [J]. PROCEEDINGS OF THE INTERNATIONAL CONFERENCE ON SOFT COMPUTING FOR PROBLEM SOLVING (SOCPROS 2011), VOL 1, 2012, 130 : 701 - 714
  • [2] Structure optimisation of input layer for feed-forward NARX neural network
    Li, Zongyan
    Best, Matt
    [J]. INTERNATIONAL JOURNAL OF MODELLING IDENTIFICATION AND CONTROL, 2016, 25 (03) : 217 - 226
  • [3] Design of an Interval Feed-Forward Neural Network
    Srivastava, Smriti
    Singh, Madhusudan
    [J]. PROCEEDINGS OF THE 2012 FIFTH INTERNATIONAL CONFERENCE ON EMERGING TRENDS IN ENGINEERING AND TECHNOLOGY (ICETET 2012), 2012, : 211 - 215
  • [4] Coherent feed-forward quantum neural network
    Singh, Utkarsh
    Goldberg, Aaron Z.
    Heshami, Khabat
    [J]. QUANTUM MACHINE INTELLIGENCE, 2024, 6 (02)
  • [5] Dynamic Successive Feed-Forward Neural Network for Learning Fuzzy Decision Tree
    Singh, Manu Pratap
    [J]. ROUGH SETS, FUZZY SETS, DATA MINING AND GRANULAR COMPUTING, RSFDGRC 2011, 2011, 6743 : 293 - 301
  • [6] An application of a feed-forward neural network model for wind speed predictions
    Kolokythas, K. V.
    Argiriou, A. A.
    [J]. INTERNATIONAL JOURNAL OF SUSTAINABLE ENERGY, 2022, 41 (04) : 323 - 340
  • [7] A novel wireless channel model with multiply feed-forward neural network
    Lv, Lin
    [J]. ICNC 2007: Third International Conference on Natural Computation, Vol 1, Proceedings, 2007, : 730 - 734
  • [8] Feed-forward neural networks
    Bebis, George
    Georgiopoulos, Michael
    [J]. IEEE Potentials, 1994, 13 (04): : 27 - 31
  • [9] On the feed-forward neural network for analyzing pantograph equations
    Az-Zo'bi, Emad A.
    Shah, Rasool
    Alyousef, Haifa A.
    Tiofack, C. G. L.
    El-Tantawy, S. A.
    [J]. AIP ADVANCES, 2024, 14 (02)
  • [10] An incremental learning preprocessor for feed-forward neural network
    Fuangkhon, Piyabute
    [J]. ARTIFICIAL INTELLIGENCE REVIEW, 2014, 41 : 183 - 210