Determining Maximal Entropy Functions for Objective Bayesian Inductive Logic

Cited: 0
Authors
Juergen Landes
Soroush Rafiee Rad
Jon Williamson
Affiliations
[1] University of Milan, Department of Philosophy “Piero Martinetti”
[2] Dutch Institute for Emergent Phenomena (DIEP) and Institute for Logic, Language and Computation (ILLC)
[3] University of Kent, Philosophy Department and Centre for Reasoning
Keywords
Inductive logic; Entropy; Maximum entropy principle; First order logic; Probability logic
DOI
Not available
Abstract
According to the objective Bayesian approach to inductive logic, premisses inductively entail a conclusion just when every probability function with maximal entropy, from all those that satisfy the premisses, satisfies the conclusion. When premisses and conclusion are constraints on probabilities of sentences of a first-order predicate language, however, it is by no means obvious how to determine these maximal entropy functions. This paper makes progress on the problem in the following ways. Firstly, we introduce the concept of a limit in entropy and show that, if the set of probability functions satisfying the premisses contains a limit in entropy, then this limit point is unique and is the maximal entropy probability function. Next, we turn to the special case in which the premisses are categorical sentences of the logical language. We show that if the uniform probability function gives the premisses positive probability, then the maximal entropy function can be found by simply conditionalising this uniform prior on the premisses. We generalise our results to demonstrate agreement between the maximal entropy approach and Jeffrey conditionalisation in the case in which there is a single premiss that specifies the probability of a sentence of the language. We show that, after learning such a premiss, certain inferences are preserved, namely inferences to inductive tautologies. Finally, we consider potential pathologies of the approach: we explore the extent to which the maximal entropy approach is invariant under permutations of the constants of the language, and we discuss some cases in which there is no maximal entropy probability function.
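The abstract's claim about categorical premisses — that when the uniform probability function gives the premisses positive probability, conditionalising that uniform prior on the premisses yields the maximal entropy function — can be illustrated numerically on a toy finite language. The sketch below is not from the paper: the two-atom language, the premiss a ∨ b, and the grid search are all illustrative assumptions.

```python
import itertools
import math

# Toy language: two atomic sentences a, b give four "worlds" (truth assignments).
# Hypothetical categorical premiss: a or b.
worlds = list(itertools.product([False, True], repeat=2))
models = [(a, b) for (a, b) in worlds if a or b]  # worlds satisfying the premiss

# Conditionalising the uniform prior (1/4 per world) on the premiss
# yields the uniform function over its three models.
maxent = {w: (1 / len(models) if w in models else 0.0) for w in worlds}

def entropy(ps):
    return -sum(p * math.log(p) for p in ps if p > 0)

h_max = entropy(maxent.values())  # = log 3 for three equiprobable models

# Sanity check: on a coarse grid, every distribution concentrated on the
# premiss's models has entropy no greater than the conditionalised prior.
step = 0.05
for i in range(int(1 / step) + 1):
    for j in range(int(1 / step) + 1):
        p1, p2 = i * step, j * step
        p3 = 1 - p1 - p2
        if p3 < -1e-9:
            continue
        assert entropy([p1, p2, max(p3, 0.0)]) <= h_max + 1e-9

print(round(h_max, 4))  # prints 1.0986
```

This only checks the propositional, finitely many worlds case; the paper's contribution concerns first-order languages, where the set of maximal entropy functions is far harder to determine.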
Pages: 555–608 (53 pages)