Maximum Entropy Inference with Quantified Knowledge

Authors

  • Owen Barnett
  • Jeff B. Paris
Abstract

We investigate uncertain reasoning with quantified sentences of the predicate calculus, treated as the limiting case of maximum entropy inference applied to finite domains.

Motivation and notation

In this modest note we consider one possible approach to the following problem P: suppose that my subjective beliefs in some sentences θ1, θ2, . . . , θm of a predicate language are constrained to satisfy a certain set K, say, of linear constraints. In that case, what belief should I assign to some other sentence φ? Following Johnson [14] and Carnap et al. [4], [5], we shall limit ourselves to the well-studied case where the overlying predicate language L contains just finitely many unary predicate symbols P1, P2, . . . , Pt and denumerably many constant symbols a1, a2, a3, . . . , the intention here being that these constants are distinct and exhaust the universe. In particular, the language has neither equality nor function symbols. Then according to ideas of de Finetti [6], Gaifman [10], ...

∗ Supported by a UK Engineering and Physical Sciences Research Council (EPSRC) Research Studentship
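As a toy illustration of this setting (my own sketch, not code from the paper): with one unary predicate P and two constants a1, a2 there are four state descriptions, and a single linear constraint such as "belief in P(a1) equals 0.7" pins down the probability of the set of worlds satisfying P(a1). For one such event constraint, the maximum-entropy solution is uniform inside the event and uniform on its complement.

```python
# Hedged sketch: maximum entropy over the four state descriptions of one
# unary predicate P on constants a1, a2, ordered as
#   w0: P(a1) & P(a2),  w1: P(a1) & ~P(a2),
#   w2: ~P(a1) & P(a2), w3: ~P(a1) & ~P(a2).
# Constraint: belief in "P(a1)" is 0.7, i.e. p0 + p1 = 0.7.

def maxent_single_event(n_worlds, event, prob):
    """Maximum-entropy distribution on n_worlds worlds given P(event) = prob.

    For a single event constraint the solution spreads `prob` uniformly
    over the worlds in `event` and `1 - prob` uniformly over the rest.
    """
    inside = len(event)
    outside = n_worlds - inside
    return [prob / inside if i in event else (1 - prob) / outside
            for i in range(n_worlds)]

p = maxent_single_event(4, {0, 1}, 0.7)
# p is approximately [0.35, 0.35, 0.15, 0.15]
```

With more than one linear constraint no such closed form exists in general, and the maximum-entropy point is found numerically; this sketch only shows the single-constraint case the abstract's setup specialises to.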


Related articles

Inference Processes for Probabilistic First Order Languages

In this thesis we will investigate inference processes for predicate languages. The main question we are concerned with in this thesis is how to choose a probability function amongst those that satisfy a certain knowledge base. This question has been extensively studied for propositional logic, and we shall investigate it for first order languages. We will first study the generalisation of Minimum D...


Entropy-driven inference and inconsistency

Probability distributions on a set of discrete variables are a suitable means to represent knowledge about their respective mutual dependencies. When new things become evident, such a distribution can be adapted to the new situation and hence submitted to a sound inference process. Knowledge acquisition and inference are here performed in the rich syntax of conditional events. Both, acquisition ...


Sine Entropy for uncertain Variables

Entropy is a measure of the uncertainty associated with a variable whose value cannot be exactly predicted. In uncertainty theory, it has been quantified so far by logarithmic entropy. However, logarithmic entropy sometimes fails to measure the uncertainty. This paper will propose another type of entropy named sine entropy as a supplement, and explore its properties. After that, the maximum en...


The Maximum Entropy Method for Lifetime Distributions

An approach to produce a model for the data generating distribution is the well-known maximum entropy method. In this approach, the partial knowledge about the data generating distribution is formulated in terms of a set of information constraints, usually moment constraints, and the inference is based on the model that maximizes Shannon’s entropy under these constraints. In this paper we inves...
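The moment-constraint recipe described above can be sketched numerically (an illustration under my own assumptions, not the paper's method): on a finite support, the entropy-maximizing distribution with a prescribed mean has Gibbs form p_i ∝ exp(−λ·x_i), and λ can be found by bisection since the mean is strictly decreasing in λ.

```python
import math

# Hedged sketch: discrete maximum entropy under a single mean constraint.
# The solution has Gibbs form p_i ∝ exp(-lam * x_i); we solve for lam by
# bisection so that the resulting mean matches the target.

def maxent_given_mean(xs, target_mean, lo=-50.0, hi=50.0, iters=200):
    def mean_for(lam):
        ws = [math.exp(-lam * x) for x in xs]
        z = sum(ws)
        return sum(w * x for w, x in zip(ws, xs)) / z

    # mean_for is strictly decreasing in lam (its derivative is -variance),
    # so bisection on [lo, hi] brackets the unique solution.
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if mean_for(mid) > target_mean:
            lo = mid   # mean too high: need larger lam
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    ws = [math.exp(-lam * x) for x in xs]
    z = sum(ws)
    return [w / z for w in ws]

# Sanity check: when the target mean is the midpoint of the support,
# lam = 0 and the maximum-entropy distribution is uniform.
p = maxent_given_mean([0, 1, 2, 3], 1.5)
```

The continuous analogue is the classical fact the abstract alludes to: with only a mean constraint on [0, ∞), maximum entropy yields the exponential lifetime distribution.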


Evolving Knowledge in Theory and Applications

Semantics for formal models of probabilistic reasoning rely on probability functions that are defined on the interpretations of the underlying classical logic. When this underlying logic is of relational nature, i.e. a fragment of first-order logic, then the space needed for representing these probability functions explicitly is exponential in both the number of predicates and the number of do...
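The blow-up mentioned above is easy to make concrete (an illustration of the counting argument, not code from the cited work): with t unary predicates and n domain elements, each predicate/element pair is independently true or false, giving 2^(t·n) interpretations that an explicit probability table must cover.

```python
# Illustrative count: number of interpretations of t unary predicates
# over a domain of n named elements. Each of the t*n atomic facts is
# independently true or false.
def n_interpretations(t, n):
    return 2 ** (t * n)

print(n_interpretations(2, 3))  # 64
print(n_interpretations(3, 4))  # 4096
```

Even modest values (say t = 5 predicates, n = 10 elements) already give 2^50 interpretations, which is why explicit representation is infeasible and structured representations are needed.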



Journal:
  • Logic Journal of the IGPL

Volume 16, Issue -

Pages -

Year of publication: 2008