Search results for: and 054 disregarding shannon entropy
Number of results: 16,840,017
Shannon information entropy is a natural measure of probability (de)localization and thus (un)predictability in various procedures of data analysis for model systems. We pay particular attention to links between the Shannon entropy and the related Fisher information notion, which jointly account for the shape and extension of continuous probability distributions. Classical, dynamical and random...
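As a minimal numerical illustration of the two quantities this abstract pairs (not code from the paper), the sketch below evaluates both for a Gaussian, where the Shannon entropy grows with spread while the Fisher information grows with localization:

```python
import numpy as np

sigma = 1.5
x = np.linspace(-12, 12, 200_001)
dx = x[1] - x[0]
p = np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

# Differential Shannon entropy H = -∫ p ln p dx (analytic: 0.5 * ln(2*pi*e*sigma^2))
H = -np.sum(p * np.log(p)) * dx

# Fisher information (location) I = ∫ (p')^2 / p dx (analytic: 1 / sigma^2)
I = np.sum(np.gradient(p, dx)**2 / p) * dx

print(H, 0.5 * np.log(2 * np.pi * np.e * sigma**2))  # entropy: grows with sigma
print(I, 1 / sigma**2)                               # Fisher info: shrinks with sigma
```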
In this paper, we introduce a goodness of fit test for exponentiality based on the Lin-Wong divergence measure. In order to estimate the divergence, we use a method similar to Vasicek's method for estimating the Shannon entropy. The critical values and the powers of the test are computed by Monte Carlo simulation. It is shown that the proposed test is competitive with other tests of exponentia...
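The paper's test statistic itself is not shown in the abstract; as a hedged sketch of the Vasicek-style spacing idea it builds on (the window `m` and the sample below are illustrative, not the authors' choices):

```python
import numpy as np

def vasicek_entropy(sample, m):
    """Vasicek spacing estimator of Shannon entropy:
    H_mn = (1/n) * sum_i ln( n/(2m) * (X_(i+m) - X_(i-m)) ),
    with order-statistic indices clamped to the sample range."""
    x = np.sort(np.asarray(sample, dtype=float))
    n = len(x)
    i = np.arange(n)
    spacings = x[np.minimum(i + m, n - 1)] - x[np.maximum(i - m, 0)]
    return np.mean(np.log(n / (2 * m) * spacings))

rng = np.random.default_rng(0)
sample = rng.exponential(scale=2.0, size=200)
# True entropy of Exp(scale=2) is 1 + ln(2) ≈ 1.693
print(vasicek_entropy(sample, m=5))
```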
Background and Aim: The content of the hygiene education and health promotion course in schools is crucial for raising awareness of hygiene and developing a hygiene culture among pupils. In this study we aimed to develop a hygiene education course with fully suitable content. Materials and Methods: In this research the content analysis technique using the "Shannon Entropy" method was used, in w...
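The abstract does not spell out the computation; a common form of Shannon-entropy weighting used in content analysis, sketched here on a made-up frequency matrix, works roughly as follows:

```python
import numpy as np

# Hypothetical frequencies: rows = source documents, columns = content categories.
F = np.array([[12., 3., 7.],
              [ 9., 5., 2.],
              [14., 1., 6.]])

P = F / F.sum(axis=0)                          # share of each document within a category
m = F.shape[0]
E = -(P * np.log(P)).sum(axis=0) / np.log(m)   # normalized entropy per category, in [0, 1]
W = (1 - E) / (1 - E).sum()                    # lower entropy => more discriminating => higher weight
print(W)
```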
If optimality is measured by average codeword length, Huffman's algorithm gives optimal codes, and the redundancy can be measured as the difference between the average codeword length and Shannon's entropy. If the objective function is replaced by an exponentially weighted average, then a simple modification of Huffman's algorithm gives optimal codes. The redundancy can now be measured as the d...
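The exponentially weighted objective is that paper's contribution; as a baseline, the sketch below runs classical Huffman coding and reports the redundancy the abstract defines, namely average codeword length minus Shannon entropy:

```python
import heapq, math

def huffman_lengths(probs):
    """Codeword lengths from standard Huffman coding (explicit codewords are
    not needed to measure redundancy)."""
    heap = [(p, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, s1 = heapq.heappop(heap)
        p2, s2 = heapq.heappop(heap)
        for i in s1 + s2:
            lengths[i] += 1              # each merge adds one bit to every merged symbol
        heapq.heappush(heap, (p1 + p2, s1 + s2))
    return lengths

probs = [0.5, 0.25, 0.15, 0.10]
L = huffman_lengths(probs)
avg = sum(p * l for p, l in zip(probs, L))
H = -sum(p * math.log2(p) for p in probs)
print(avg, H, avg - H)                   # redundancy = average length - entropy >= 0
```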
We examine the entropy of stationary nonequilibrium measures of boundary driven symmetric simple exclusion processes. In contrast with the Gibbs–Shannon entropy [1, 10], the entropy of nonequilibrium stationary states differs from the entropy of local equilibrium states.
The moment-entropy inequality shows that a continuous random variable with given second moment and maximal Shannon entropy must be Gaussian. Stam's inequality shows that a continuous random variable with given Fisher information and minimal Shannon entropy must also be Gaussian. The Cramér–Rao inequality is a direct consequence of these two inequalities. In this paper the inequalities above are ...
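In entropy-power form, the standard way the two stated inequalities combine into Cramér–Rao can be written as:

```latex
% Entropy power: N(X) = \frac{1}{2\pi e}\, e^{2 h(X)}, with h the differential Shannon entropy.
% Moment-entropy inequality:  N(X) \le \operatorname{Var}(X)   (equality iff X is Gaussian)
% Stam's inequality:          N(X)\, I(X) \ge 1                (equality iff X is Gaussian)
% Chaining the two yields the Cramér–Rao inequality:
\operatorname{Var}(X)\, I(X) \;\ge\; N(X)\, I(X) \;\ge\; 1
\quad\Longrightarrow\quad
\operatorname{Var}(X) \;\ge\; \frac{1}{I(X)} .
```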
We mathematically explore a model for the shortness and security of passwords that are stored in hashed form. The model is implicit in the NIST publication [8] and is based on conditions on the Shannon, guessing and min entropy. In addition we establish various new relations between these three notions of entropy, providing strong improvements on existing bounds such as the McEliece-Yu bound...
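On a toy password distribution (not data from the paper), the three entropy notions the abstract relates can be computed as follows:

```python
import math

# Hypothetical password probabilities, sorted with the most likely first.
p = [0.4, 0.3, 0.2, 0.1]

shannon = -sum(q * math.log2(q) for q in p)              # Shannon entropy H(X), in bits
guessing = sum(i * q for i, q in enumerate(p, start=1))  # guessing entropy G(X): expected
                                                         # number of optimal-order guesses
min_entropy = -math.log2(max(p))                         # min-entropy: cost of the single
                                                         # best guess, in bits
print(shannon, guessing, min_entropy)
```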
In statistical physics, the Boltzmann-Shannon entropy provides a good understanding of the equilibrium states of a number of phenomena. In statistics, the entropy corresponds to the maximum likelihood method, in which the Kullback-Leibler divergence connects the Boltzmann-Shannon entropy and the expected log-likelihood function. Maximum likelihood estimation has been supported for the optimal performanc...
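The connection the abstract points to is the standard decomposition of the Kullback-Leibler divergence, under which maximizing the expected log-likelihood is equivalent to minimizing the divergence from the data distribution:

```latex
% For a data distribution p and a model q_\theta:
D_{\mathrm{KL}}(p \,\|\, q_\theta)
  = \mathbb{E}_p[\log p] - \mathbb{E}_p[\log q_\theta]
  = -H(p) - \mathbb{E}_p[\log q_\theta],
% so, with H(p) independent of \theta, maximizing the expected log-likelihood
% \mathbb{E}_p[\log q_\theta] is the same as minimizing D_{\mathrm{KL}}(p \| q_\theta).
```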
Based on the local Shannon entropy concept in information theory, a new measure of aromaticity is introduced. This index, which describes the probability of electronic charge distribution between atoms in a given ring, is called Shannon aromaticity (SA). Using the B3LYP method and different basis sets (6-31G**, 6-31+G** and 6-311++G**), the SA values of some five-membered heterocycles, C(4)H(4)X, a...
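The abstract does not reproduce the formula; in one published form of the index (assumed here, to be checked against the paper's exact convention), SA is the gap between the maximal entropy and the Shannon entropy of the normalized electron densities at the ring's bond critical points:

```python
import math

def shannon_aromaticity(densities):
    """SA = ln(N) - S, with S the Shannon entropy of the N bond-critical-point
    electron densities normalized to probabilities. SA approaches 0 for a
    perfectly uniform (delocalized) ring. Definition assumed, not from the paper."""
    total = sum(densities)
    probs = [d / total for d in densities]
    S = -sum(q * math.log(q) for q in probs)
    return math.log(len(probs)) - S

# Hypothetical densities for two five-membered rings:
print(shannon_aromaticity([0.31, 0.30, 0.29, 0.31, 0.30]))  # near-uniform -> small SA
print(shannon_aromaticity([0.40, 0.22, 0.35, 0.20, 0.33]))  # less uniform -> larger SA
```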
The introduction of the Rényi entropy allowed a generalization of the Shannon entropy and unified its notion with that of other entropies. However, so far there is no generally accepted conditional version of the Rényi entropy corresponding to that of the Shannon entropy. The definitions proposed so far in the literature lack central and natural properties in one way or another. In this...
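For reference, the Rényi entropy of order α, together with one of the competing conditional definitions (Arimoto's), drawn from the broader literature rather than from this paper:

```latex
% Rényi entropy of order \alpha (\alpha > 0, \alpha \ne 1); recovers Shannon as \alpha \to 1:
H_\alpha(X) = \frac{1}{1-\alpha} \log \sum_x p(x)^\alpha
% One competing conditional version (Arimoto's):
H_\alpha^{\mathrm{A}}(X \mid Y)
  = \frac{\alpha}{1-\alpha} \log \sum_y \Bigl( \sum_x p(x,y)^\alpha \Bigr)^{1/\alpha}
```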