Search results for: information entropy theory
Number of results: 1,867,273
The logical basis for information theory is the newly developed logic of partitions that is dual to the usual Boolean logic of subsets. The key concept is a "distinction" of a partition, an ordered pair of elements in distinct blocks of the partition. The logical concept of entropy based on partition logic is the normalized counting measure of the set of distinctions of a partition on a finite ...
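The normalized counting measure described above has a simple closed form: for a partition of an n-element set, non-distinctions are exactly the within-block ordered pairs, so the logical entropy is 1 minus the sum of squared block fractions. A minimal sketch (the function name and block-list representation are my own):

```python
def logical_entropy(partition):
    """Logical entropy of a partition of a finite set, given as a list of
    disjoint blocks: the normalized count of "distinctions", i.e. ordered
    pairs (u, v) whose elements lie in distinct blocks."""
    n = sum(len(block) for block in partition)
    # |dit(pi)| / n^2 reduces to 1 - sum((|B|/n)^2) over blocks B,
    # since the non-distinctions are exactly the within-block pairs.
    return 1.0 - sum((len(block) / n) ** 2 for block in partition)

# The one-block partition distinguishes nothing; finer partitions score higher.
print(logical_entropy([[1, 2], [3, 4]]))  # → 0.5  (8 of 16 ordered pairs)
print(logical_entropy([[1, 2, 3, 4]]))    # → 0.0
```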
Entropy is a useful concept that has been used to describe the structure and behavior of different systems. We summarize its multifaceted character with regard to its implications for urban sprawl, and propose a framework to apply the concept of entropy to urban sprawl for monitoring and management.
It was recently shown that estimating the Shannon entropy H(p) of a discrete k-symbol distribution p requires Θ(k/ log k) samples, a number that grows near-linearly in the support size. In many applications H(p) can be replaced by the more general Rényi entropy of order α, Hα(p). We determine the number of samples needed to estimate Hα(p) for all α, showing that α < 1 requires a super-linear, r...
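For reference, the quantity whose estimation is studied above: a minimal plug-in sketch of the Rényi entropy H_α(p) for a known distribution p (the abstract's concern is the harder problem of estimating it from samples; names here are my own):

```python
import math

def renyi_entropy(p, alpha):
    """Renyi entropy of order alpha, in bits, of a probability vector p:
    H_alpha(p) = log2(sum p_i^alpha) / (1 - alpha).
    alpha = 1 is taken as the Shannon limit H(p) = -sum p_i log2 p_i."""
    support = [pi for pi in p if pi > 0]
    if alpha == 1:
        return -sum(pi * math.log2(pi) for pi in support)
    return math.log2(sum(pi ** alpha for pi in support)) / (1 - alpha)

# On the uniform distribution over k symbols, H_alpha = log2(k) for every alpha.
print(renyi_entropy([0.25] * 4, 0.5))  # → 2.0
print(renyi_entropy([0.25] * 4, 1))    # → 2.0
print(renyi_entropy([0.25] * 4, 2))    # → 2.0
```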
In uncertainty theory, quadratic entropy is a type of entropy that provides a quantitative measure of the uncertainty of uncertain variables. This paper presents the maximum entropy principle for quadratic entropy of uncertain variables: out of all the uncertainty distributions satisfying given constraints, choose the one with maximum quadratic entropy.
The paper examines relationships between the conditional Shannon entropy and the expectation of the ℓα-norm for joint probability distributions. More precisely, we investigate the tight bounds of the expectation of the ℓα-norm with a fixed conditional Shannon entropy, and vice versa. As applications of the results, we derive the tight bounds between the conditional Shannon entropy and several informati...
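Both quantities compared in that abstract can be computed directly from a joint distribution; a minimal sketch (the dict-of-pairs representation and function names are my own choices):

```python
import math
from collections import defaultdict

def conditional_entropy(joint):
    """H(Y|X) in bits, for a joint pmf given as a dict {(x, y): p}."""
    px = defaultdict(float)
    for (x, _), p in joint.items():
        px[x] += p
    # H(Y|X) = -sum_{x,y} p(x,y) * log2( p(x,y) / p(x) )
    return -sum(p * math.log2(p / px[x])
                for (x, _), p in joint.items() if p > 0)

def l_alpha_norm(p, alpha):
    """The l_alpha-norm (sum p_i^alpha)^(1/alpha) of a probability vector."""
    return sum(pi ** alpha for pi in p) ** (1 / alpha)

# Two independent fair bits: knowing X says nothing about Y, so H(Y|X) = 1 bit.
joint = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}
print(conditional_entropy(joint))   # → 1.0
print(l_alpha_norm([0.25] * 4, 2))  # → 0.5
```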
The concept of entropy plays a major part in communication theory. The Shannon entropy is a measure of uncertainty with respect to an a priori probability distribution. In algorithmic information theory, the information content of a message is measured in terms of the size in bits of the smallest program for computing that message. This paper discusses the classical entropy and entropy rate for dis...
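The contrast drawn above can be illustrated numerically: the empirical Shannon entropy of a message's symbol frequencies versus its compressed size, where the latter is only a crude upper-bound stand-in for algorithmic information content, since true Kolmogorov complexity is uncomputable. A hedged sketch:

```python
import math
import zlib
from collections import Counter

def shannon_entropy_bits_per_symbol(message: bytes) -> float:
    """Empirical Shannon entropy of the byte frequencies, in bits/symbol."""
    counts = Counter(message)
    n = len(message)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def compressed_bits(message: bytes) -> int:
    """Bits after zlib compression: an upper-bound proxy for the
    "smallest program" measure (Kolmogorov complexity is uncomputable)."""
    return 8 * len(zlib.compress(message, 9))

# A highly regular message: 1 bit/symbol of Shannon entropy, and far
# fewer bits than the raw message once compressed.
msg = b"ab" * 1000
print(shannon_entropy_bits_per_symbol(msg))  # → 1.0
print(compressed_bits(msg) < 8 * len(msg))   # → True
```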
Entropy is a measure of the uncertainty associated with a variable whose value cannot be exactly predicted. In uncertainty theory, it has been quantified so far by logarithmic entropy. However, logarithmic entropy sometimes fails to measure the uncertainty. This paper will propose another type of entropy named sine entropy as a supplement, and explore its properties. After that, the maximum en...
Theories and Results: Generalizing fractional kinetics, which successfully models anomalous diffusion, a theory is presented for describing the infection pathway of the virus through the cytoplasm. The statistical property of the fluctuations of the anomalous-diffusion exponent is also discussed based on a maximum-entropy-principle approach. In addition, an issue regarding the continuum limit of the en...
Some approaches to the covering information entropy and some definitions of orderings and quasi-orderings of coverings will be described, generalizing the case of the partition entropy and ordering. The aim is to extend to coverings the general result of anti-tonicity (strictly decreasing monotonicity) of partition entropy. In particular, an entropy in the case of incomplete information systems i...