Search results for: entropy measure
Number of results: 404,955
A new concept of the inclusion measure for intuitionistic fuzzy sets is proposed via an axiomatic definition. The distance measures, similarity measures, and information entropy are also recalled and summarized. Most importantly, some relationships among the distance measure, information entropy, and inclusion measure of IFSs are then investigated. Finally, we obtain some important theorems by whi...
The notion of the informational measure of symmetry is introduced according to: Hsym(G) = −∑_{i=1}^{k} P(Gi) ln P(Gi), where P(Gi) is the probability of appearance of the symmetry operation Gi within a given 2D pattern. Hsym(G) is interpreted as an averaged uncertainty in the presence of elements from the symmetry group G. For an "ideal" pattern built of identical equilateral triangles it is established that Hsym(D3) = 1.792. For a random, completely disordered pattern it is zero, Hsym = 0. calculated ...
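As an illustrative sketch (not code from the paper), the stated value Hsym(D3) = 1.792 can be reproduced by assuming that the six operations of the dihedral group D3 (identity, two rotations, three reflections — the symmetry group of an equilateral triangle) each appear with equal probability 1/6, so that Hsym = ln 6:

```python
import math

def h_sym(probs):
    """Informational measure of symmetry: Hsym = -sum p_i * ln(p_i).
    Zero-probability operations contribute nothing to the sum."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# Assumption: in the "ideal" triangle pattern, all six D3 operations
# appear with equal probability 1/6.
d3_probs = [1 / 6] * 6
print(round(h_sym(d3_probs), 3))  # 1.792

# A pattern with a single certain operation has zero symmetry entropy.
print(h_sym([1.0]))  # 0.0
```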
The focus of the paper is to furnish an entropy measure for a neutrosophic set and a neutrosophic soft set, which is a measure of the uncertainty that permeates the discourse and the system. Various characterizations of entropy measures are derived. Further, we exemplify this concept by applying entropy to various real-time decision-making problems. Keywords—Entropy measure, Hausdorff distance, neutrosophic ...
In the present communication, the existing measures of fuzzy entropy are reviewed. A generalized parametric exponential fuzzy entropy is defined. Our study of the four essential and some other properties of the proposed measure clearly establishes its validity as an entropy. Keywords—fuzzy sets, fuzzy entropy, exponential entropy, exponential fuzzy entropy.
In this paper, we introduce a goodness-of-fit test for exponentiality based on the Lin-Wong divergence measure. In order to estimate the divergence, we use a method similar to Vasicek's method for estimating the Shannon entropy. The critical values and the powers of the test are computed by Monte Carlo simulation. It is shown that the proposed test is competitive with other tests of exponentia...
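The truncated snippet does not show the estimator itself; as a hedged illustration, Vasicek's spacing-based Shannon entropy estimator (which the paper's method is said to resemble) can be sketched as follows, checked against Exp(1), whose true entropy is 1:

```python
import math
import random

def vasicek_entropy(sample, m):
    """Vasicek's spacing-based Shannon entropy estimator:
    H = (1/n) * sum_i ln( n/(2m) * (x_(i+m) - x_(i-m)) ),
    with order statistics clamped at the sample boundaries."""
    x = sorted(sample)
    n = len(x)
    total = 0.0
    for i in range(n):
        hi = x[min(i + m, n - 1)]  # x_(i+m), clamped to x_(n)
        lo = x[max(i - m, 0)]      # x_(i-m), clamped to x_(1)
        total += math.log(n / (2 * m) * (hi - lo))
    return total / n

random.seed(0)
data = [random.expovariate(1.0) for _ in range(2000)]
# True Shannon entropy of Exp(1) is 1; the estimate should be close.
print(vasicek_entropy(data, m=20))
```

The window size m trades bias against variance; the clamping at the boundaries is the standard convention for Vasicek-type estimators.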
The measure-theoretic definition of Kullback-Leibler relative entropy (or simply KL-entropy) plays a basic role in defining various classical information measures on general spaces. Entropy, mutual information, and conditional forms of entropy can be expressed in terms of KL-entropy, and hence properties of their measure-theoretic analogs will follow from those of measure-theoretic KL-entropy. The...
Shannon entropy is an efficient tool to measure uncertain information. However, it cannot handle the more uncertain situation in which the uncertainty is represented by a basic probability assignment (BPA) instead of a probability distribution, under the framework of Dempster-Shafer evidence theory. To address this issue, a new entropy, named Deng entropy, is proposed. The proposed Deng entropy is...
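The truncated snippet does not show the formula; the sketch below assumes the standard definition of Deng entropy, E_d = −∑_A m(A) log2(m(A) / (2^|A| − 1)) over the focal elements A of the BPA, which reduces to Shannon entropy (in bits) when all focal elements are singletons:

```python
import math

def deng_entropy(bpa):
    """Deng entropy of a basic probability assignment.
    bpa maps focal elements (frozensets) to masses summing to 1.
    E_d = -sum m(A) * log2( m(A) / (2^|A| - 1) )."""
    return -sum(m * math.log2(m / (2 ** len(A) - 1))
                for A, m in bpa.items() if m > 0)

# Singleton focal elements: reduces to Shannon entropy (bits).
bayesian = {frozenset({"a"}): 0.5, frozenset({"b"}): 0.5}
print(deng_entropy(bayesian))  # 1.0

# Mass on a multi-element focal set yields a larger value,
# reflecting the extra uncertainty a BPA can express.
vague = {frozenset({"a", "b"}): 1.0}
print(deng_entropy(vague))  # log2(3) ≈ 1.585
```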
We develop a fine-scale local analysis of measure entropy and measure sequence entropy based on combinatorial independence. The concepts of measure IE-tuples and measure IN-tuples are introduced and studied in analogy with their counterparts in topological dynamics. Local characterizations of the Pinsker von Neumann algebra and its sequence entropy analogue are given in terms of combinatorial i...
For noninvertible maps, mainly subshifts of finite type and piecewise monotone interval maps, we investigate what happens if we follow backward trajectories, random in the sense that at each step every preimage can be chosen with equal probability. In particular, we ask what happens if we try to compute the entropy this way. It turns out that instead of the topological entropy we get the metric ...