Search results for: information entropy
Number of results: 1,203,337
Entropy is well known to be Schur concave on finite alphabets. Recently, the authors have strengthened the result by showing that for any pair of probability distributions P and Q with Q majorized by P, the entropy of Q is larger than the entropy of P by the amount of relative entropy D(P||Q). This result applies to P and Q defined on countable alphabets. This paper shows the counterpart of t...
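A small numerical sanity check of the stated relationship, read as the inequality H(Q) ≥ H(P) + D(P||Q) when Q is majorized by P; the distributions below are purely illustrative and the helper names are my own:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in nats (assumes full support)."""
    p = np.asarray(p, dtype=float)
    return -np.sum(p * np.log(p))

def relative_entropy(p, q):
    """Relative entropy D(P||Q) in nats (assumes full support)."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return np.sum(p * np.log(p / q))

def majorizes(p, q):
    """True if P majorizes Q: partial sums of P sorted in decreasing order dominate those of Q."""
    ps = np.cumsum(np.sort(p)[::-1])
    qs = np.cumsum(np.sort(q)[::-1])
    return bool(np.all(ps >= qs - 1e-12))

P = [0.7, 0.2, 0.1]   # illustrative distributions
Q = [0.5, 0.3, 0.2]   # Q is majorized by P

assert majorizes(P, Q)
print(entropy(Q), entropy(P) + relative_entropy(P, Q))  # H(Q) >= H(P) + D(P||Q)
```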
Shannon entropy of a probability measure P, defined as $-\int_X \frac{dP}{d\mu}\,\ln\frac{dP}{d\mu}\,d\mu$ on a measure space $(X,\mathcal{M},\mu)$, is not a natural extension from the discrete case. However, maximum entropy (ME) prescriptions of the Shannon entropy functional in the measure-theoretic case are consistent with those for the discrete case. Also, it is well known that Kullback-Leibler relative entropy can be extended natural...
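For reference, a minimal restatement of the quantities involved, using the snippet's notation; with $\mu$ taken to be the counting measure on a countable $X$, both expressions reduce to their familiar discrete forms:

$$
S(P) \;=\; -\int_X \frac{dP}{d\mu}\,\ln\frac{dP}{d\mu}\,d\mu,
\qquad
D(P\,\|\,Q) \;=\; \int_X \frac{dP}{d\mu}\,\ln\frac{dP/d\mu}{dQ/d\mu}\,d\mu,
$$

which become $-\sum_x p(x)\ln p(x)$ and $\sum_x p(x)\ln\frac{p(x)}{q(x)}$ in the discrete case.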
This paper deals with the estimation of transfer entropy based on the k-nearest neighbors (k-NN) method. To this end, we first investigate the estimation of Shannon entropy involving a rectangular neighboring region, as suggested in the existing literature, and develop two kinds of entropy estimators. Then, applying the widely used error-cancellation approach to these entropy estimators, we...
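As background for the k-NN approach, a minimal sketch of a Kozachenko-Leonenko-style k-nearest-neighbor entropy estimator (one common variant, not the rectangular-region estimators developed in the paper):

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln

def knn_entropy(x, k=4):
    """k-NN estimate of differential Shannon entropy in nats.

    x: array of shape (n_samples, dim). This is one common form of the
    Kozachenko-Leonenko estimator; variants differ in small additive terms.
    """
    x = np.asarray(x, dtype=float)
    n, d = x.shape
    tree = cKDTree(x)
    # distance to the k-th nearest neighbour (the query returns the point itself first)
    r = tree.query(x, k=k + 1)[0][:, -1]
    log_vd = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)  # log volume of the unit d-ball
    return digamma(n) - digamma(k) + log_vd + d * np.mean(np.log(r))

# sanity check: a 1D standard normal has differential entropy 0.5*ln(2*pi*e) ~ 1.4189 nats
rng = np.random.default_rng(0)
print(knn_entropy(rng.standard_normal((5000, 1))))
```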
We discuss a one-parameter family of generalized cross entropy between two distributions with a power index, called the projective power entropy. The cross entropy essentially reduces to the Tsallis entropy if the two distributions are taken to be equal. Statistical and probabilistic properties associated with the projective power entropy are extensively investigated, including a characterizati...
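The snippet does not state the definition of the projective power entropy itself; for context, the Tsallis entropy it refers to is commonly written, for a discrete distribution $p$ and power index $q$, as

$$
S_q(p) \;=\; \frac{1}{q-1}\Bigl(1-\sum_i p_i^{\,q}\Bigr),
$$

which recovers the Shannon entropy $-\sum_i p_i \ln p_i$ in the limit $q \to 1$.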
The concept of entropy plays a major part in communication theory. The Shannon entropy is a measure of uncertainty with respect to an a priori probability distribution. In algorithmic information theory, the information content of a message is measured in terms of the size in bits of the smallest program for computing that message. This paper discusses the classical entropy and entropy rate for dis...
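To make the Shannon side of this contrast concrete, a short sketch computing the empirical entropy (bits per symbol) of a message from its symbol frequencies; the algorithmic (program-size) notion is not computable, so it has no analogous snippet:

```python
from collections import Counter
from math import log2

def shannon_entropy_bits(message: str) -> float:
    """Empirical Shannon entropy (bits per symbol) of a message's symbol frequencies."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * log2(c / n) for c in counts.values())

print(shannon_entropy_bits("abracadabra"))  # ~2.04 bits per symbol
```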
In fuzzy set theory, information measures play a paramount role in several areas such as decision making and pattern recognition. In this paper, a similarity measure based on the cosine function and entropy measures based on the logarithmic function for IFSs are proposed. Comparisons of the proposed similarity and entropy measures with existing ones are listed. Numerical results clearly indicate th...
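For illustration, one commonly cited cosine-type similarity between intuitionistic fuzzy sets (IFSs), where each element carries a membership degree mu and a non-membership degree nu; this is a generic measure of that family, not necessarily the one proposed in the paper, and the function name is my own:

```python
import numpy as np

def ifs_cosine_similarity(mu_a, nu_a, mu_b, nu_b):
    """Cosine-type similarity between two intuitionistic fuzzy sets A and B.

    Each IFS is given by per-element membership (mu) and non-membership (nu)
    degrees with mu + nu <= 1. Averages the per-element cosine of the
    (mu, nu) vectors; an illustrative measure, not the paper's proposal.
    """
    a = np.stack([np.asarray(mu_a, float), np.asarray(nu_a, float)], axis=1)
    b = np.stack([np.asarray(mu_b, float), np.asarray(nu_b, float)], axis=1)
    num = np.sum(a * b, axis=1)
    den = np.linalg.norm(a, axis=1) * np.linalg.norm(b, axis=1)
    return float(np.mean(num / den))

A = ([0.6, 0.8, 0.3], [0.2, 0.1, 0.5])  # (mu, nu) per element
B = ([0.5, 0.7, 0.4], [0.3, 0.2, 0.4])
print(ifs_cosine_similarity(*A, *B))
```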
A compact fin-tube heat exchanger is used to transfer heat from the fluid flowing inside the tubes to the air outside. In this study, entropy production and the optimized Reynolds number for finned-tube heat exchangers based on minimum entropy production have been investigated. As a result, the total entropy production of compact heat exchangers, which is the summation of the production rate of fluid entropy ins...
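As context for the minimum-entropy-production criterion, a standard decomposition of the entropy generation rate in a heat-exchanger passage into a finite-temperature-difference term and a pressure-drop term (a common textbook approximation, not necessarily the exact model used in the paper):

$$
\dot S_{\mathrm{gen}} \;=\; \dot S_{\mathrm{gen},\Delta T} + \dot S_{\mathrm{gen},\Delta p}
\;\approx\; \frac{\dot Q\,\Delta T}{T_h\,T_c} \;+\; \frac{\dot m\,\Delta p}{\rho\,T},
$$

where $\dot Q$ is the heat duty transferred across the temperature difference $\Delta T = T_h - T_c$, and $\dot m$, $\Delta p$, $\rho$, $T$ are the stream's mass flow rate, pressure drop, density, and absolute temperature; minimizing $\dot S_{\mathrm{gen}}$ over the Reynolds number trades the two terms against each other.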