HIGH-ORDER NETWORKS THAT LEARN TO SATISFY LOGIC CONSTRAINTS

Cited by: 0
Authors
Pinkas, Gadi [1]
Cohen, Shimon [1]
Affiliation
[1] Afeka Tel Aviv Acad Coll Engn, Afeka Ctr Language Proc, Tel Aviv, Israel
Keywords
artificial neural networks; planning as SAT; constraint satisfaction; unsupervised learning; logic; neural-symbolic integration; high-order neural connections; NEURAL-NETWORKS; ALGORITHM; KNOWLEDGE;
DOI
Not available
Chinese Library Classification (CLC)
B81 [Logic]
Discipline codes
010104; 010105
Abstract
Logic-based problems such as planning, formal verification, and inference typically involve combinatorial search and structured knowledge representation. Artificial neural networks (ANNs) are very successful statistical learners; however, they have been criticized for their weakness in representing and processing the complex structured knowledge that is crucial for combinatorial search and symbol manipulation. Two high-order neural architectures are presented (symmetric and RNN), which can encode structured relational knowledge in neural activation and store bounded first-order logic (FOL) constraints in connection weights. Both architectures learn to search for a solution that satisfies the constraints. Learning is done by unsupervised "practicing" on problem instances from the same domain, in a way that improves the network's solving speed. No teacher exists to provide answers for the problem instances of the training and test sets; instead, the domain constraints are provided as prior knowledge, encoded in a loss function that measures the degree of constraint violation. Iterations of activation calculation and learning are executed until a solution that maximally satisfies the constraints emerges on the output units. As a test case, blocks-world planning problems are used to train flat networks with high-order connections that learn to plan in that domain, but the proposed techniques could be used more generally, for example to integrate prior symbolic knowledge with statistical learning.
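To make the "loss as degree of constraint violation" idea concrete, the following is a minimal, hypothetical sketch (not the authors' implementation and not their blocks-world or bounded-FOL encoding): a toy CNF-style constraint set, a differentiable violation loss over sigmoidal unit activations, and a gradient-descent "activation calculation" loop that runs until the thresholded output satisfies every constraint. The clause list, learning rate, and helper names are illustrative assumptions, and the unsupervised weight-learning ("practicing") phase is omitted.

import numpy as np

# Hypothetical toy constraint set (an assumption for illustration, not the paper's encoding).
# Each clause is a list of (variable_index, polarity) pairs, read as a disjunction of literals:
# here (x0 OR x1), (NOT x0 OR x2), (NOT x1 OR NOT x2).
CLAUSES = [[(0, True), (1, True)],
           [(0, False), (2, True)],
           [(1, False), (2, False)]]
N_VARS = 3

def soft_values(act):
    # Squash real-valued activations into [0, 1] "truth degrees".
    return 1.0 / (1.0 + np.exp(-act))

def violation_and_grad(act):
    # Loss = sum over clauses of prod(1 - literal value); it is 0 iff every clause is satisfied.
    x = soft_values(act)
    grad_x = np.zeros_like(x)
    loss = 0.0
    for clause in CLAUSES:
        vals = np.array([x[i] if pos else 1.0 - x[i] for i, pos in clause])
        miss = 1.0 - vals                        # how far each literal is from being true
        loss += miss.prod()
        for k, (i, pos) in enumerate(clause):
            others = np.prod(np.delete(miss, k))
            grad_x[i] += -others if pos else others
    return loss, grad_x * x * (1.0 - x)          # chain rule through the sigmoid

def satisfied(assignment):
    # Check the thresholded (hard) assignment against every clause.
    return all(any(assignment[i] == pos for i, pos in clause) for clause in CLAUSES)

rng = np.random.default_rng(0)
act = 0.1 * rng.standard_normal(N_VARS)          # output units start near "undecided"
for step in range(2000):                         # "activation calculation" iterations
    loss, grad = violation_and_grad(act)
    if satisfied(soft_values(act) > 0.5):
        break
    act -= 0.5 * grad                            # descend on the violation loss

print("assignment:", (soft_values(act) > 0.5).astype(int), "violation:", round(loss, 4))

In the paper's setting, the constraints would instead be bounded-FOL domain constraints compiled into high-order connection weights, and those weights would also be updated during unsupervised practicing to speed up later solving; the sketch keeps only the relax-until-satisfied loop over activations.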
Pages: 653-693
Number of pages: 41