On Possibilistic and Probabilistic Information Fusion

Author

  • Ronald R. Yager
Abstract

This article discusses the basic features of information expressed in terms of possibilistic uncertainty. It points out the entailment principle, a tool that allows one to infer less specific information from a given piece of information. The problem of fusing multiple pieces of possibilistic information is considered, and the basic features of probabilistic information are described. The author details a procedure for transforming information between possibilistic and probabilistic representations and uses it as the basis for a technique for fusing multiple pieces of uncertain information, some of it possibilistic and some probabilistic. A procedure is also provided for addressing the problems that arise when the information to be fused contains conflicts.

DOI: 10.4018/978-1-4666-1870-1.ch005
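
The abstract names three concrete ingredients: the entailment principle, a transformation between possibilistic and probabilistic representations, and a fusion rule that must cope with conflict. The Python sketch below illustrates standard textbook versions of each (an entailment check, the common Dubois–Prade-style possibility-to-probability transformation, and min-based conjunctive fusion with a max-based fallback under total conflict). It is a generic illustration under those assumptions, not the specific procedures developed in the chapter, and the example distributions are invented.

```python
# Illustrative sketch of standard possibility-theory operations the abstract
# alludes to. These follow common textbook definitions and are NOT claimed to
# be the exact procedures of Yager's chapter.

def is_entailed(pi_source, pi_weaker):
    """Entailment principle: from a possibility distribution pi_source one may
    infer any less specific distribution pi_weaker, i.e. one with
    pi_weaker(x) >= pi_source(x) for every element x."""
    return all(pi_weaker[x] >= pi_source[x] for x in pi_source)

def possibility_to_probability(pi):
    """One widely used possibility-to-probability transformation (often
    attributed to Dubois and Prade): with values sorted so that
    pi_1 >= pi_2 >= ... >= pi_n and pi_{n+1} = 0,
    p_i = sum_{j=i}^{n} (pi_j - pi_{j+1}) / j."""
    items = sorted(pi.items(), key=lambda kv: kv[1], reverse=True)
    values = [v for _, v in items] + [0.0]          # append pi_{n+1} = 0
    n = len(items)
    probs = {}
    for i, (x, _) in enumerate(items):              # 0-based index; rank = j + 1
        probs[x] = sum((values[j] - values[j + 1]) / (j + 1) for j in range(i, n))
    return probs

def fuse(pi1, pi2):
    """Conjunctive (min) fusion over a shared domain, renormalized; if the
    sources are in total conflict (the min distribution is identically zero),
    fall back to the disjunctive (max) rule, a common way of handling conflict."""
    conj = {x: min(pi1[x], pi2[x]) for x in pi1}
    height = max(conj.values())
    if height == 0:                                  # total conflict
        return {x: max(pi1[x], pi2[x]) for x in pi1}
    return {x: v / height for x, v in conj.items()}  # renormalize to height 1

if __name__ == "__main__":
    pi_a = {"low": 1.0, "medium": 0.7, "high": 0.2}
    pi_b = {"low": 0.4, "medium": 1.0, "high": 0.6}
    print(is_entailed(pi_a, {"low": 1.0, "medium": 1.0, "high": 0.5}))  # True
    print(possibility_to_probability(pi_a))  # sums to 1
    print(fuse(pi_a, pi_b))
```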

Similar articles

Product-based Causal Networks and Quantitative Possibilistic Bases

In possibility theory, there are two kinds of possibilistic causal networks, depending on whether possibilistic conditioning is based on the minimum or on the product operator. Similarly, there are two kinds of possibilistic logic: standard (min-based) possibilistic logic and quantitative (product-based) possibilistic logic. Recently, several equivalent transformations between standard possibilistic...
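
The snippet contrasts min-based and product-based possibilistic conditioning. The sketch below shows the standard possibility-theory forms of these two operators (they are generic definitions, not taken from the cited paper, and the example distribution is invented).

```python
# Standard min-based (qualitative) and product-based (quantitative)
# possibilistic conditioning operators, as usually defined in possibility
# theory; shown for illustration only.

def possibility_of_event(pi, event):
    """Pi(phi) = max of pi over the worlds belonging to the event phi."""
    return max(pi[w] for w in event)

def condition_min(pi, event):
    """Min-based conditioning: worlds outside the event get 0; the most
    plausible worlds inside the event are raised to 1; the rest keep their
    original possibility degree."""
    height = possibility_of_event(pi, event)
    return {w: (1.0 if pi[w] == height else pi[w]) if w in event else 0.0
            for w in pi}

def condition_product(pi, event):
    """Product-based conditioning: worlds outside the event get 0; inside the
    event, degrees are rescaled by dividing by Pi(event)."""
    height = possibility_of_event(pi, event)
    return {w: pi[w] / height if w in event else 0.0 for w in pi}

pi = {"w1": 1.0, "w2": 0.6, "w3": 0.3}
event = {"w2", "w3"}
print(condition_min(pi, event))      # {'w1': 0.0, 'w2': 1.0, 'w3': 0.3}
print(condition_product(pi, event))  # {'w1': 0.0, 'w2': 1.0, 'w3': 0.5}
```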

Reasoning with multiple-source information in a possibilistic logic framework

This paper addresses the problem of merging uncertain information in the framework of possibilistic logic. It presents several syntactic combination rules to merge possibilistic knowledge bases, provided by different sources, into a new possibilistic knowledge base. These combination rules are first described at the meta-level outside the language of possibilistic logic. Next, an extension of p...
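
As a point of reference for the kind of syntactic combination rule the snippet describes, the sketch below shows only the simplest case: conjunctive (min-based) merging of two possibilistic bases, whose syntactic counterpart in standard possibilistic logic is essentially the union of the weighted bases. The base representation and example formulas are invented for illustration, and the cited paper's own rules are not reproduced here.

```python
# Minimal sketch of conjunctive (min-based) merging of two possibilistic
# knowledge bases. A base is represented as a mapping formula -> weight, with
# the weight in (0, 1] read as a lower bound on the necessity of the formula.
# The syntactic counterpart of min-based combination is the union of the bases;
# other rules (e.g. maximum-based merging) rewrite formulas and are not shown.

def merge_conjunctive(base1, base2):
    """Union of the two weighted bases; if the same formula appears in both,
    keep the higher (more constraining) weight."""
    merged = dict(base1)
    for formula, weight in base2.items():
        merged[formula] = max(merged.get(formula, 0.0), weight)
    return merged

source_a = {"bird -> flies": 0.8, "penguin -> bird": 1.0}
source_b = {"penguin -> ~flies": 0.9, "bird -> flies": 0.6}
print(merge_conjunctive(source_a, source_b))
```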

A Naive Bayes Style Possibilistic Classifier

Naive Bayes classifiers can be seen as special probabilistic networks with a star-like structure. They can easily be induced from a dataset of sample cases. However, like most probabilistic approaches, they run into problems if imprecise (i.e., set-valued) information in the data to learn from has to be taken into account. An approach to handle uncertain as well as imprecise information, which recen...
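
To make the "naive Bayes style" idea concrete, the sketch below replaces the probabilistic product of prior and conditional probabilities with a min-combination of a class possibility and per-attribute conditional possibility degrees. This is a generic illustration of the idea only; the data, attribute names, and degrees are invented, and the induction procedure of the cited paper is not reproduced.

```python
# Generic naive-Bayes-style possibilistic classification: combine a class
# possibility with per-attribute conditional possibilities using min (which
# plays the role of the probabilistic product), then pick the best class.

def classify(instance, class_possibility, conditional_possibility):
    """Return the class with the highest combined possibility degree.

    class_possibility:       {class: pi(class)}
    conditional_possibility: {class: {attribute: {value: pi(value | class)}}}
    instance:                {attribute: value}
    """
    scores = {}
    for c, prior in class_possibility.items():
        degrees = [prior] + [conditional_possibility[c][a][v]
                             for a, v in instance.items()]
        scores[c] = min(degrees)
    return max(scores, key=scores.get), scores

class_pi = {"spam": 1.0, "ham": 1.0}
cond_pi = {
    "spam": {"has_link": {True: 1.0, False: 0.4}, "length": {"short": 1.0, "long": 0.5}},
    "ham":  {"has_link": {True: 0.3, False: 1.0}, "length": {"short": 0.7, "long": 1.0}},
}
print(classify({"has_link": True, "length": "short"}, class_pi, cond_pi))
# ('spam', {'spam': 1.0, 'ham': 0.3})
```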

IEEE IRI 2014 keynote speech (I): The information principle

The conventional wisdom is that the concept of information is closely related to the concept of probability. In Shannon's information theory, information is equated to a reduction in entropy—a probabilistic concept. In this paper, a different view of information is put on the table. Information is equated to restriction. More concretely, a restriction is a limitation on the values which a varia...

The Information Principle

The conventional wisdom is that the concept of information is closely related to the concept of probability. In Shannon's information theory, information is equated to a reduction in entropy—a probabilistic concept. In this paper, a different view of information is put on the table. Information is equated to restriction. More concretely, a restriction is a limitation on the values which a varia...

Journal:
  • IJFSA

Volume 1, Issue 

Pages  -

Publication date: 2011