Exploiting Determinism to Scale Relational Inference

Cited by: 0
Authors
Ibrahim, Mohamed-Hamza [1 ]
Pal, Christopher [1 ]
Pesant, Gilles [1 ]
Affiliations
[1] Univ Montreal, Ecole Polytech Montreal, Dept Comp & Software Engn, 2500 Chemin Polytech, Montreal, PQ, Canada
Funding
Natural Sciences and Engineering Research Council of Canada;
Keywords
BELIEF PROPAGATION;
DOI
Not available
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
One key challenge in statistical relational learning (SRL) is scalable inference. Unfortunately, most real-world problems in SRL have expressive models that translate into large grounded networks, representing a bottleneck for any inference method and weakening its scalability. In this paper we introduce Preference Relaxation (PR), a two-stage strategy that uses the determinism present in the underlying model to improve the scalability of relational inference. The basic idea of PR is that if the underlying model involves mandatory (i.e. hard) constraints as well as preferences (i.e. soft constraints) then it is potentially wasteful to allocate memory for all constraints in advance when performing inference. To avoid this, PR starts by relaxing preferences and performing inference with hard constraints only. It then removes variables that violate hard constraints, thereby avoiding irrelevant computations involving preferences. In addition it uses the removed variables to enlarge the evidence database. This reduces the effective size of the grounded network. Our approach is general and can be applied to various inference methods in relational domains. Experiments on real-world applications show how PR substantially scales relational inference with a minor impact on accuracy.
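The two-stage strategy described in the abstract can be illustrated on a toy propositional model. The sketch below is a minimal, hypothetical rendering of the idea, not the paper's implementation: hard constraints are clauses that must hold, soft constraints (preferences) are weighted clauses, stage 1 relaxes the preferences and fixes every variable forced by the hard constraints (here via simple unit propagation, enlarging the evidence database), and stage 2 runs inference with the preferences only over the surviving free variables (here by exact enumeration for clarity; the paper targets general relational inference methods). All names and the three-variable model are invented for illustration.

```python
import math
from itertools import product

# Clauses are lists of (variable, polarity) literals; a clause is satisfied
# when at least one literal matches the world's assignment.
hard = [[("A", True)],                      # A must be true
        [("A", False), ("B", True)]]        # A implies B
soft = [(1.5, [("C", True)]),               # preference: C true (weight 1.5)
        (0.5, [("B", False), ("C", False)])]
variables = ["A", "B", "C"]

def unit_propagate(hard, evidence):
    """Stage 1: relax preferences and fix every variable whose value
    is forced by the hard constraints, growing the evidence database."""
    evidence = dict(evidence)
    changed = True
    while changed:
        changed = False
        for clause in hard:
            satisfied = any(evidence.get(v) == p
                            for v, p in clause if v in evidence)
            open_lits = [(v, p) for v, p in clause if v not in evidence]
            if not satisfied and len(open_lits) == 1:
                v, p = open_lits[0]         # unit clause: value is forced
                evidence[v] = p
                changed = True
    return evidence

def marginals(variables, hard, soft, evidence):
    """Stage 2: weighted inference over the reduced (free) variables only,
    pruning any world that violates a hard constraint."""
    free = [v for v in variables if v not in evidence]
    totals = {v: 0.0 for v in free}
    z = 0.0
    for values in product([False, True], repeat=len(free)):
        world = dict(evidence, **dict(zip(free, values)))
        if not all(any(world[v] == p for v, p in c) for c in hard):
            continue                        # hard-constraint violation: skip
        w = math.exp(sum(wt for wt, c in soft
                         if any(world[v] == p for v, p in c)))
        z += w
        for v in free:
            if world[v]:
                totals[v] += w
    return {v: totals[v] / z for v in free}

evidence = unit_propagate(hard, {})         # forces A=True, then B=True
print(evidence)                             # {'A': True, 'B': True}
print(marginals(variables, hard, soft, evidence))
```

Because stage 1 fixes A and B, stage 2 enumerates only C's two values instead of all eight worlds, which is the source of the scalability gain the abstract claims: the grounded network shrinks before any preference is ever materialized.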
Pages: 1756-1762
Number of pages: 7
Related Papers
50 items in total
  • [41] Stochastic relational processes: Efficient inference and applications
    Ingo Thon
    Niels Landwehr
    Luc De Raedt
    Machine Learning, 2011, 82 : 239 - 272
  • [43] Modeling and Inference with Relational Dynamic Bayesian Networks
    Manfredotti, Cristina
    ADVANCES IN ARTIFICIAL INTELLIGENCE, PROCEEDINGS, 2009, 5549 : 287 - +
  • [44] A short note on fuzzy relational inference systems
    Stepnicka, Martin
    Jayaram, Balasubramaniam
    Su, Yong
    FUZZY SETS AND SYSTEMS, 2018, 338 : 90 - 96
  • [45] Unsupervised relational inference using masked reconstruction
    Grossmann, Gerrit
    Zimmerlin, Julian
    Backenkoehler, Michael
    Wolf, Verena
    APPLIED NETWORK SCIENCE, 2023, 8 (01)
  • [46] Guiding inference through relational reinforcement learning
    Asgharbeygi, N
    Nejati, N
    Langley, P
    Arai, S
    INDUCTIVE LOGIC PROGRAMMING, PROCEEDINGS, 2005, 3625 : 20 - 37
  • [47] A mathematical theory of relational generalization in transitive inference
    Lippl, Samuel
    Kay, Kenneth
    Jensen, Greg
    Ferrera, Vincent P.
    Abbott, L. F.
    PROCEEDINGS OF THE NATIONAL ACADEMY OF SCIENCES OF THE UNITED STATES OF AMERICA, 2024, 121 (28)
  • [48] Analogical Inference for Multi-relational Embeddings
    Liu, Hanxiao
    Wu, Yuexin
    Yang, Yiming
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 70, 2017, 70
  • [49] On the complexity of inference about probabilistic relational models
    Jaeger, M
    ARTIFICIAL INTELLIGENCE, 2000, 117 (02) : 297 - 308
  • [50] Neural relational and dynamics inference for complex systems
    Zhang, Fan
    Zhu, Tianyu
    Shi, Xinli
    Cao, Jinde
    Abdel-Aty, Mahmoud
    COMPUTERS & INDUSTRIAL ENGINEERING, 2024, 197