Search results for: modified shannon entropy

Number of results: 321667

Introduction: The English language teaching curriculum is central to effective teaching and learning. Given the importance of English as one of the most important communication tools, it is necessary to develop a curriculum that can accommodate all the necessary English language teaching needs. Therefore, the purpose of this study is to analyze t...

2013
Peter Clifford Ioana Cosma

We consider the problem of approximating the empirical Shannon entropy of a high-frequency data stream under the relaxed strict-turnstile model, when space limitations make exact computation infeasible. An equivalent measure of entropy is the Rényi entropy that depends on a constant α. This quantity can be estimated efficiently and unbiasedly from a low-dimensional synopsis called an α-stable da...
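A minimal sketch of the identity this line of work relies on, assuming an explicit empirical distribution (the actual method estimates the α-th frequency moments from a low-dimensional α-stable sketch rather than from exact counts): the Rényi entropy H_α(p) = log(Σᵢ pᵢ^α)/(1−α) converges to the Shannon entropy as α → 1, so an estimate at α close to 1 approximates H.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum p_i log p_i (nats)."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def renyi_entropy(p, alpha):
    """Renyi entropy H_alpha(p) = log(sum p_i^alpha) / (1 - alpha), alpha != 1."""
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

# Empirical distribution of a toy item stream (the streaming setting would only
# keep a small sketch of these frequencies, not the exact counts used here).
stream = np.random.default_rng(0).integers(0, 50, size=10_000)
counts = np.bincount(stream).astype(float)
p = counts / counts.sum()

print("Shannon H:", shannon_entropy(p))
for alpha in (1.1, 1.01, 1.001):
    print(f"Renyi H_{alpha}:", renyi_entropy(p, alpha))
# The Renyi values approach the Shannon value as alpha -> 1.
```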

2015
Guy Jumarie

By combining the explicit formula of the Shannon informational entropy H(X, Y) for two random variables X and Y with the entropy H(f(X)) of f(X), where f(·) is a real-valued differentiable function, we have shown that the density of the amount of information in the Shannon sense involved in a non-random differentiable function is defined by the logarithm of the absolute value of its de...
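The statement can be read against the standard change-of-variables identity for differential entropy; the following is only a sketch of that standard relation (for a strictly monotone, differentiable f with nonvanishing derivative), not Jumarie's derivation:

```latex
% Change of variables for differential entropy, Y = f(X), f strictly monotone:
\begin{align*}
  p_Y(y) &= \frac{p_X(x)}{\lvert f'(x)\rvert}, \qquad y = f(x),\\
  h(f(X)) &= -\int p_Y(y)\,\log p_Y(y)\,\mathrm{d}y
           = -\int p_X(x)\,\log\!\frac{p_X(x)}{\lvert f'(x)\rvert}\,\mathrm{d}x\\
          &= h(X) + \mathbb{E}\bigl[\log\lvert f'(X)\rvert\bigr].
\end{align*}
% The extra term E[log|f'(X)|] is the information density contributed by f.
```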

Journal: :CoRR 2017
Maciej Skorski

The weak law of large numbers implies that, under mild assumptions on the source, the Rényi entropy per produced symbol converges (in probability) towards the Shannon entropy rate. This paper quantifies the speed of this convergence for sources with independent (but not iid) outputs, generalizing and improving the result of Holenstein and Renner (IEEE Trans. Inform. Theory, 2011). (a) we charac...
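The law-of-large-numbers statement can be illustrated numerically in the simplest i.i.d. special case (the paper itself covers independent but non-identically distributed outputs): the per-symbol information content −(1/n) log p(X₁, …, Xₙ) concentrates around the Shannon entropy rate as n grows. A minimal sketch, assuming a Bernoulli source with an arbitrary illustrative parameter:

```python
import numpy as np

rng = np.random.default_rng(1)
q = 0.3                                            # illustrative Bernoulli parameter
H = -(q * np.log(q) + (1 - q) * np.log(1 - q))     # Shannon entropy rate (nats/symbol)

for n in (100, 1_000, 10_000, 100_000):
    x = rng.random(n) < q                          # one length-n realization of the source
    # Per-symbol information content -(1/n) log p(x_1, ..., x_n)
    log_p = np.where(x, np.log(q), np.log(1 - q)).sum()
    print(f"n={n:>6}: {-log_p / n:.4f}   (H = {H:.4f})")
# By the weak law of large numbers the per-symbol value concentrates around H.
```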

Journal: :Advances in Complex Systems 2002
Marcelo A. Montemurro Damián H. Zanette

Beyond the local constraints imposed by grammar, words concatenated in long sequences carrying a complex message show statistical regularities that may reflect their linguistic role in the message. In this paper, we perform a systematic statistical analysis of the use of words in literary English corpora. We show that there is a quantitative relation between the role of content words in literar...
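As a rough illustration of the kind of word-level statistic involved (not necessarily the exact measure used by Montemurro and Zanette), one can compute the Shannon entropy of how a word's occurrences are spread over equal slices of a text; on real corpora, content words tend to be concentrated in particular parts and therefore score lower than evenly used function words.

```python
from collections import Counter
import math

def part_entropy(tokens, word, n_parts=8):
    """Shannon entropy (in bits) of how the occurrences of `word` are spread
    over n_parts equal slices of the text; lower values mean the word is
    concentrated in a few parts."""
    size = max(1, len(tokens) // n_parts)
    counts = Counter()
    for i, tok in enumerate(tokens):
        if tok == word:
            counts[min(i // size, n_parts - 1)] += 1
    total = sum(counts.values())
    if total == 0:
        return 0.0
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Tiny toy token list for illustration; a real analysis would tokenize a full
# literary text.
tokens = "the whale the ship the sea captain whale harpoon the sea".split()
print(part_entropy(tokens, "the", n_parts=2))
print(part_entropy(tokens, "whale", n_parts=2))
```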

Journal: :CoRR 2017
Shanyun Liu Rui She Jiaxun Lu Pingyi Fan

Shannon entropy is the foundation of information theory and has proven effective in many fields such as communications. Rényi entropy and Chernoff information are two other popular measures of information with wide applications. Mutual information is an effective measure of channel information because it reflects the relation between output variables an...
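A small self-contained sketch of the quantities mentioned, evaluated on a toy binary alphabet and a binary symmetric channel (the alphabet, the channel, and the grid search for the Chernoff exponent are illustrative choices, not the paper's setting):

```python
import numpy as np

def shannon(p):
    """Shannon entropy H(p) = -sum p_i log p_i (nats)."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def renyi(p, alpha):
    """Renyi entropy H_alpha(p) = log(sum p_i^alpha) / (1 - alpha)."""
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def chernoff_information(p, q, grid=2001):
    """C(P, Q) = -min_{0<=s<=1} log sum_i p_i^s q_i^(1-s), by grid search over s."""
    s = np.linspace(0.0, 1.0, grid)[:, None]
    vals = np.log(np.sum(p[None, :] ** s * q[None, :] ** (1.0 - s), axis=1))
    return -vals.min()

def mutual_information(p_x, channel):
    """I(X; Y) for input distribution p_x and row-stochastic matrix channel[x, y]."""
    p_xy = p_x[:, None] * channel          # joint distribution
    p_y = p_xy.sum(axis=0)                 # output marginal
    prod = p_x[:, None] * p_y[None, :]     # product of marginals
    mask = p_xy > 0
    return np.sum(p_xy[mask] * np.log(p_xy[mask] / prod[mask]))

p = np.array([0.7, 0.3])
q = np.array([0.4, 0.6])
bsc = np.array([[0.9, 0.1], [0.1, 0.9]])   # binary symmetric channel, crossover 0.1
print("H(p)   =", shannon(p))
print("H_2(p) =", renyi(p, 2.0))
print("C(p,q) =", chernoff_information(p, q))
print("I(X;Y) =", mutual_information(np.array([0.5, 0.5]), bsc))
```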

2015
Cafer Caferov Baris Kaya Ryan O'Donnell A. C. Cem Say

Let p be an unknown probability distribution on [n] := {1, 2, ..., n} that we can access via two kinds of queries: A SAMP query takes no input and returns x ∈ [n] with probability p[x]; a PMF query takes as input x ∈ [n] and returns the value p[x]. We consider the task of estimating the entropy of p to within ±∆ (with high probability). For the usual Shannon entropy H(p), we show that Ω(log n/...
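A minimal sketch of the query model (not of the paper's algorithm or its lower bound): since E_{x∼p}[−log p[x]] = H(p), averaging −log p[x] over SAMP draws, with each p[x] read off via a PMF query, gives an unbiased estimate whose accuracy improves with the number of queries. The helpers `samp` and `pmf` below are hypothetical stand-ins for the two oracles.

```python
import math
import random

def estimate_shannon_entropy(samp, pmf, m):
    """Average -log p[x] over m SAMP draws, reading each p[x] via a PMF query.
    Since E[-log p[X]] = H(p) when X ~ p, the average is an unbiased estimate
    (a sketch of the query model only, not the paper's algorithm)."""
    total = 0.0
    for _ in range(m):
        x = samp()                      # SAMP query: x drawn with probability p[x]
        total += -math.log(pmf(x))      # PMF query: the exact value p[x]
    return total / m

# Toy distribution on [n] = {0, 1, 2, 3}; `samp` and `pmf` are hypothetical oracles.
p = [0.5, 0.25, 0.125, 0.125]
samp = lambda: random.choices(range(len(p)), weights=p)[0]
pmf = lambda x: p[x]
print(estimate_shannon_entropy(samp, pmf, 20_000))   # close to H(p) ≈ 1.213 nats
```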

2009
Peter Clifford Ioana Ada Cosma

We consider the problem of approximating the empirical Shannon entropy of a high-frequency data stream when space limitations make exact computation infeasible. It is known that α-dependent quantities such as the Rényi and Tsallis entropies can be estimated efficiently and unbiasedly from low-dimensional α-stable data sketches. An approximation to the Shannon entropy can be obtained from either ...
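One way to see how an α-dependent quantity yields the Shannon entropy, assuming here that the α-th frequency moments Σᵢ pᵢ^α are known exactly (the point of the sketch-based approach is that α-stable sketches estimate them in small space): H(p) = −d/dα [Σᵢ pᵢ^α] at α = 1, so a finite difference of the moment at two values of α near 1 recovers H. A minimal sketch:

```python
import numpy as np

def shannon(p):
    """Shannon entropy H(p) = -sum p_i log p_i (nats)."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

rng = np.random.default_rng(3)
counts = rng.poisson(5.0, size=300) + 1            # toy item frequencies
p = counts / counts.sum()

def frequency_moment(alpha):
    """alpha-th moment sum_i p_i^alpha (the quantity an alpha-stable sketch estimates)."""
    return np.sum(p ** alpha)

# H(p) = -d/d(alpha) [sum_i p_i^alpha] at alpha = 1; a central finite difference
# of the moment at two alpha values near 1 therefore approximates H.
eps = 1e-3
H_fd = -(frequency_moment(1 + eps) - frequency_moment(1 - eps)) / (2 * eps)
print("finite-difference estimate:", H_fd)
print("exact Shannon entropy     :", shannon(p))
```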

2003
Dan A. Simovici Szymon Jaroszewicz

We introduce an extension of the notion of Shannon conditional entropy to a more general form of conditional entropy that captures both the conditional Shannon entropy and a similar notion related to the Gini index. The proposed family of conditional entropies generates a collection of metrics over the set of partitions of finite sets, which can be used to construct decision trees. Experimental...
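As a sketch of the kind of one-parameter family the abstract describes (the Havrda-Charvát/Tsallis form is used here as a standard example and is not necessarily the paper's exact definition), the β-entropy (1 − Σᵢ pᵢ^β)/(β − 1) reduces to the Gini index at β = 2 and to the Shannon entropy as β → 1, and its weighted conditional version can score decision-tree splits:

```python
import numpy as np

def beta_entropy(p, beta):
    """Havrda-Charvat / Tsallis-style family (1 - sum p_i^beta) / (beta - 1):
    beta -> 1 recovers Shannon entropy (nats), beta = 2 gives the Gini index."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if abs(beta - 1.0) < 1e-9:
        return -np.sum(p * np.log(p))
    return (1.0 - np.sum(p ** beta)) / (beta - 1.0)

def conditional_beta_entropy(blocks, beta):
    """Weighted entropy of the class distributions inside the blocks induced by
    a split: sum_j (n_j / n) * H_beta(block_j); usable as a split criterion."""
    n = sum(len(b) for b in blocks)
    total = 0.0
    for b in blocks:
        _, counts = np.unique(b, return_counts=True)
        total += (len(b) / n) * beta_entropy(counts / len(b), beta)
    return total

# Class labels in the two children of a candidate decision-tree split (toy data).
left, right = np.array([0, 0, 0, 1]), np.array([1, 1, 0, 1, 1])
for beta in (1.0, 2.0):
    print(beta, conditional_beta_entropy([left, right], beta))
```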
