Search results for: modified shannon entropy
Number of results: 321667
Chaos is often explained in terms of random behaviour, and having positive Kolmogorov–Sinai entropy (KSE) is taken to be indicative of randomness. Although seemingly plausible, the association of positive KSE with random behaviour needs justification, since the definition of the KSE does not make reference to any notion connected to randomness. A common way of justifying this use of the KSE...
If optimality is measured by average codeword length, Huffman's algorithm gives optimal codes, and the redundancy can be measured as the difference between the average codeword length and Shannon's entropy. If the objective function is replaced by an exponentially weighted average, then a simple modification of Huffman's algorithm gives optimal codes. The redundancy can now be measured as the d...
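The redundancy comparison in the plain (linear-average) case can be reproduced in a few lines. The Python sketch below builds ordinary Huffman codeword lengths with a heap and reports average length, Shannon entropy, and their difference; the exponentially weighted variant discussed in the abstract is not implemented here, and the example distribution is arbitrary.

```python
import heapq
import math

def huffman_code_lengths(probs):
    """Return optimal (Huffman) codeword lengths for a probability list."""
    # Heap items: (probability, unique id, symbol indices in this subtree).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    uid = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:          # every merge adds one bit to the merged symbols
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, uid, s1 + s2))
        uid += 1
    return lengths

probs = [0.4, 0.3, 0.2, 0.1]
lengths = huffman_code_lengths(probs)
avg_len = sum(p * l for p, l in zip(probs, lengths))
shannon = -sum(p * math.log2(p) for p in probs)
print(f"average length = {avg_len:.3f} bits, entropy = {shannon:.3f} bits, "
      f"redundancy = {avg_len - shannon:.3f} bits")
```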
Based on the local Shannon entropy concept in information theory, a new measure of aromaticity is introduced. This index, which describes the probability of electronic charge distribution between atoms in a given ring, is called Shannon aromaticity (SA). Using the B3LYP method and different basis sets (6-31G**, 6-31+G** and 6-311++G**), the SA values of some five-membered heterocycles, C(4)H(4)X, a...
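A rough illustration of an entropy-based ring index, assuming (purely for the sketch) that the ring's charge distribution is reduced to normalized per-atom weights and that aromaticity is read as closeness of the ring's Shannon entropy to its maximum ln N. The real SA index is computed from B3LYP electron densities; the function and values below are hypothetical stand-ins.

```python
import math

def shannon_aromaticity(charges):
    """Hypothetical sketch of an entropy-based ring index.

    The actual SA index is built from electron densities obtained in a
    DFT (B3LYP) calculation; the simple normalization below is only an
    illustrative stand-in for that data.
    """
    total = sum(charges)
    probs = [c / total for c in charges]              # p_i: share of charge on atom i
    s = -sum(p * math.log(p) for p in probs if p > 0) # local Shannon entropy of the ring
    return math.log(len(charges)) - s                  # 0 for a perfectly uniform (delocalized) ring

# Example: a five-membered ring; equal charges give the smallest possible value
print(shannon_aromaticity([0.2, 0.2, 0.2, 0.2, 0.2]))
print(shannon_aromaticity([0.4, 0.25, 0.15, 0.12, 0.08]))
```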
We mathematically explore a model for the shortness and security of passwords that are stored in hashed form. The model is implicit in the NIST publication [8] and is based on conditions on the Shannon, Guessing and Min entropies. In addition, we establish various new relations between these three notions of entropy, providing strong improvements on existing bounds such as the McEliece-Yu bound...
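The three entropy notions the abstract relates can be computed directly for any finite distribution; a minimal sketch follows, using an arbitrary skewed "password" distribution (the McEliece-Yu bound itself is not reproduced here).

```python
import math

def shannon_entropy(p):
    """H(X) = -sum p_i log2 p_i, in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def min_entropy(p):
    """H_min(X) = -log2 max_i p_i, in bits."""
    return -math.log2(max(p))

def guessing_entropy(p):
    """G(X) = sum_i i * p_(i): expected number of guesses when guessing
    outcomes in order of decreasing probability (a count, not bits)."""
    ordered = sorted(p, reverse=True)
    return sum(i * pi for i, pi in enumerate(ordered, start=1))

# A skewed distribution: a few very common passwords, many rare ones
p = [0.25, 0.15, 0.10] + [0.50 / 100] * 100
print(f"Shannon : {shannon_entropy(p):.3f} bits")
print(f"Min     : {min_entropy(p):.3f} bits")
print(f"Guessing: {guessing_entropy(p):.1f} expected guesses")
```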
The moment-entropy inequality shows that a continuous random variable with given second moment and maximal Shannon entropy must be Gaussian. Stam's inequality shows that a continuous random variable with given Fisher information and minimal Shannon entropy must also be Gaussian. The Cramér-Rao inequality is a direct consequence of these two inequalities. In this paper the inequalities above are ...
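The moment-entropy side of the statement can be checked numerically from closed-form differential entropies: among zero-mean densities with the same second moment, the Gaussian's entropy is largest. The comparison below uses the uniform and Laplace densities as arbitrary non-Gaussian examples; Stam's inequality and the paper's generalizations are not touched.

```python
import math

sigma2 = 1.0  # fix the second moment (variance, zero mean) for all three densities

# Closed-form differential entropies (in nats) for densities with variance sigma2
h_gaussian = 0.5 * math.log(2 * math.pi * math.e * sigma2)
h_uniform  = math.log(2 * math.sqrt(3 * sigma2))   # uniform on [-sqrt(3*sigma2), +sqrt(3*sigma2)]
h_laplace  = 1 + math.log(math.sqrt(2 * sigma2))   # Laplace with scale b = sqrt(sigma2 / 2)

print(f"Gaussian: {h_gaussian:.4f} nats")  # largest, as the moment-entropy inequality predicts
print(f"Laplace : {h_laplace:.4f} nats")
print(f"Uniform : {h_uniform:.4f} nats")
```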
In statistical physics, Boltzmann-Shannon entropy provides a good understanding of the equilibrium states of a number of phenomena. In statistics, the entropy corresponds to the maximum likelihood method, in which the Kullback-Leibler divergence connects Boltzmann-Shannon entropy and the expected log-likelihood function. Maximum likelihood estimation has been supported for the optimal performanc...
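The connection mentioned here is the identity -E_p[log q] = H(p) + D_KL(p || q): maximizing the expected log-likelihood of a model q is equivalent to minimizing its Kullback-Leibler divergence from the data distribution p. A small numerical check, with arbitrary example distributions:

```python
import math

def entropy(p):
    """Boltzmann-Shannon entropy H(p) in nats."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def kl(p, q):
    """Kullback-Leibler divergence D_KL(p || q)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def expected_log_likelihood(p, q):
    """E_p[log q]: population-level log-likelihood of model q under data distribution p."""
    return sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.3, 0.2]   # "true" data distribution
q = [0.4, 0.4, 0.2]   # candidate model

# Identity: -E_p[log q] = H(p) + D_KL(p || q)
lhs = -expected_log_likelihood(p, q)
rhs = entropy(p) + kl(p, q)
print(f"{lhs:.6f} == {rhs:.6f}")
```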
We say that two free p.m.p. actions of countable groups are Shannon orbit equivalent if there is an orbit equivalence between them whose associated cocycle partitions have finite Shannon entropy. We show that if the acting groups are sofic and each has a w-normal amenable subgroup which is neither locally finite nor virtually cyclic, then Shannon orbit equivalence implies that the actions have the same maximum sofic entropy. This extends a result of Austin beyond the finitely generated setting, with the consequence that Bernoulli...
Hidden Markov chains are widely applied statistical models of stochastic processes, from fundamental physics and chemistry to finance, health, and artificial intelligence. The hidden processes they generate are notoriously complicated, however, even if the chain is finite state: no closed-form expression for their Shannon entropy rate exists, as the set of predictive features is generically infinite. As such, to date ...
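Although no closed-form entropy rate exists, it can be approximated by block-entropy differences h_L = H(L) - H(L-1), which decrease toward the rate as L grows. The sketch below computes these exactly (via the forward algorithm) for a made-up two-state, two-symbol hidden Markov chain; the transition and emission parameters are illustrative only and are not taken from the paper.

```python
import math
from itertools import product

# A small 2-state, 2-symbol hidden Markov chain (illustrative parameters only)
T = [[0.7, 0.3],   # T[i][j] = P(next state = j | current state = i)
     [0.4, 0.6]]
E = [[0.9, 0.1],   # E[i][k] = P(emit symbol k | state = i)
     [0.2, 0.8]]
pi = [4 / 7, 3 / 7]  # stationary distribution of T (solves pi T = pi)

def word_prob(word):
    """Forward algorithm: exact probability of an output word."""
    alpha = [pi[s] * E[s][word[0]] for s in range(2)]
    for symbol in word[1:]:
        alpha = [sum(alpha[s] * T[s][t] for s in range(2)) * E[t][symbol]
                 for t in range(2)]
    return sum(alpha)

def block_entropy(L):
    """H(L): Shannon entropy (bits) of length-L output blocks."""
    return -sum(p * math.log2(p)
                for p in (word_prob(w) for w in product(range(2), repeat=L))
                if p > 0)

# h_L = H(L) - H(L-1) decreases toward the entropy rate
for L in range(2, 11):
    print(f"L={L:2d}  h_L = {block_entropy(L) - block_entropy(L - 1):.6f} bits/symbol")
```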