HIGH-ORDER NETWORKS THAT LEARN TO SATISFY LOGIC CONSTRAINTS

Cited by: 0
Authors
Pinkas, Gadi [1]
Cohen, Shimon [1]
Affiliations
[1] Afeka Tel Aviv Academic College of Engineering, Afeka Center for Language Processing, Tel Aviv, Israel
Keywords
artificial neural networks; planning as SAT; constraint satisfaction; unsupervised learning; logic; neural-symbolic integration; high-order neural connections
DOI
Not available
Chinese Library Classification (CLC)
B81 [Logic]
Subject classification codes
010104; 010105
Abstract
Logic-based problems such as planning, formal verification and inference typically involve combinatorial search and structured knowledge representation. Artificial neural networks (ANNs) are very successful statistical learners; however, they have been criticized for their weakness in representing and processing the complex structured knowledge that is crucial for combinatorial search and symbol manipulation. Two high-order neural architectures are presented (Symmetric and RNN), which can encode structured relational knowledge in neural activation and store bounded First Order Logic (FOL) constraints in connection weights. Both architectures learn to search for a solution that satisfies the constraints. Learning is done by unsupervised "practicing" on problem instances from the same domain, in a way that improves the network's solving speed. No teacher exists to provide answers for the problem instances of the training and test sets; instead, the domain constraints are provided as prior knowledge, encoded in a loss function that measures the degree of constraint violation. Iterations of activation calculation and learning are executed until a solution that maximally satisfies the constraints emerges on the output units. As a test case, blocks-world planning problems are used to train flat networks with high-order connections that learn to plan in that domain, but the proposed techniques could be used more generally, for example in integrating prior symbolic knowledge with statistical learning.
Source
Journal of Applied Logics, 2019, 6 (04): 653-693 (41 pages)
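
To make the abstract's training loop concrete, the following is a minimal illustrative sketch, not the authors' implementation: Boolean variables are relaxed to sigmoid activations, the constraints are a hypothetical toy CNF formula, and a differentiable loss measuring the degree of constraint violation is minimized by iterated activation updates. The clause encoding, variable names, learning rate, and the use of numerical gradients are all assumptions made for brevity.

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # Hypothetical toy constraints (not from the paper): a CNF formula over
    # three Boolean variables, (x0 OR x1) AND (NOT x0 OR x2) AND (NOT x1 OR NOT x2).
    # Each clause is a list of (variable index, polarity) literals.
    clauses = [[(0, True), (1, True)],
               [(0, False), (2, True)],
               [(1, False), (2, False)]]

    def violation(a, clause):
        # Degree to which a clause is violated: the product of the degrees
        # to which each of its literals is false (0 = satisfied, 1 = violated).
        v = 1.0
        for i, positive in clause:
            v *= (1.0 - a[i]) if positive else a[i]
        return v

    def loss(a):
        # Total degree of constraint violation, as in the abstract's loss.
        return sum(violation(a, c) for c in clauses)

    rng = np.random.default_rng(0)
    z = rng.normal(size=3)          # pre-activations of the variable units
    lr, eps = 0.5, 1e-4             # assumed step size and finite-difference step
    for step in range(2000):
        a = sigmoid(z)
        # Numerical gradient of the violation loss w.r.t. the pre-activations
        # (an analytic gradient would work equally well).
        grad = np.zeros_like(z)
        base = loss(a)
        for i in range(len(z)):
            zp = z.copy()
            zp[i] += eps
            grad[i] = (loss(sigmoid(zp)) - base) / eps
        z -= lr * grad

    a = sigmoid(z)
    print("activations:", np.round(a, 3), "violation loss:", round(loss(a), 6))

Running the sketch drives the violation loss toward zero, so the rounded activations approach a satisfying assignment. The architectures in the paper additionally adapt connection weights during this "practicing" to speed up later solving, which the sketch omits.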