Search results for: and 054 disregarding shannon entropy

Number of results: 16840017

Shannon entropy is increasingly used in many applications. In this article, an estimator of the entropy of a continuous random variable is proposed. Consistency and scale invariance of the variance and mean squared error of the proposed estimator are proved, and comparisons are then made with the entropy estimators of Vasicek (1976), van Es (1992), Ebrahimi et al. (1994), and Correa (1995). A simulation st...
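To make the kind of estimator being compared concrete, here is a minimal sketch of Vasicek's (1976) spacing-based estimator of differential entropy, not the article's own code; the sample size, window parameter `m`, and test distribution below are illustrative choices:

```python
import math
import random

def vasicek_entropy(sample, m):
    """Vasicek (1976) spacing estimator of differential entropy (in nats):
    H = (1/n) * sum_i log( n/(2m) * (X_(i+m) - X_(i-m)) ),
    where X_(i) are order statistics, clipped to X_(1) and X_(n) at the ends."""
    x = sorted(sample)
    n = len(x)
    total = 0.0
    for i in range(n):
        lo = x[max(i - m, 0)]          # X_(i-m), clipped at the lower end
        hi = x[min(i + m, n - 1)]      # X_(i+m), clipped at the upper end
        total += math.log(n * (hi - lo) / (2 * m))
    return total / n

# Sanity check against a distribution with known entropy:
# for N(0,1), the true differential entropy is 0.5*log(2*pi*e) ~ 1.4189 nats.
random.seed(0)
sample = [random.gauss(0.0, 1.0) for _ in range(5000)]
est = vasicek_entropy(sample, m=20)
true_h = 0.5 * math.log(2 * math.pi * math.e)
```

The estimator is consistent but mildly biased for finite samples, which is why papers in this line compare bias and mean squared error across estimator variants.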

Journal: :International Journal of Mathematics and Mathematical Sciences 2005

Journal: :CoRR 2008
Shigeru Furuichi

Shannon entropy [1] is one of the fundamental quantities in classical information theory and is uniquely determined by the Shannon-Khinchin axioms or the Faddeev axiom. One-parameter extensions of Shannon entropy have been studied by many researchers [2]; the Rényi entropy [3] and the Tsallis entropy [4] are well-known examples. In the paper [5], the uniqueness theorem for the Tsallis entropy was proven. See also ...
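For reference, the three entropies named above can be written down in a few lines; this is a generic sketch of the standard definitions (in nats), not code from the paper:

```python
import math

def shannon(p):
    """Shannon entropy: -sum p_i log p_i."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def renyi(p, alpha):
    """Renyi entropy of order alpha != 1: log(sum p_i^alpha) / (1 - alpha)."""
    assert alpha != 1
    return math.log(sum(pi ** alpha for pi in p)) / (1 - alpha)

def tsallis(p, q):
    """Tsallis entropy of index q != 1: (1 - sum p_i^q) / (q - 1)."""
    assert q != 1
    return (1 - sum(pi ** q for pi in p)) / (q - 1)

p = [0.5, 0.25, 0.25]
```

Both one-parameter families reduce to the Shannon entropy as their parameter tends to 1, and for a fixed distribution the Rényi entropy is non-increasing in its order.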

2000
A. Zeilinger Michael J. W. Hall

It is pointed out that the case for Shannon entropy and von Neumann entropy, as measures of uncertainty in quantum mechanics, is not as bleak as suggested in quant-ph/0006087. The main argument of the latter is based on one particular interpretation of Shannon’s H-function (related to consecutive measurements), and is shown explicitly to fail for other physical interpretations. Further, it is s...

In this paper, we obtain the Rényi entropy rate for irreducible-aperiodic Markov chains with countable state space, using the theory of countable nonnegative matrices. We also obtain the bound for the rate of Rényi entropy of an irreducible Markov chain. Finally, we show that the bound for the Rényi entropy rate is the Shannon entropy rate.
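A standard way to compute the Rényi entropy rate of a finite Markov chain, which the countable-state result above generalizes, uses the Perron eigenvalue of the matrix with entries p_ij^α. The sketch below (a two-state example with an illustrative transition matrix, not the paper's construction) also shows numerically that the rate approaches the Shannon entropy rate as α → 1:

```python
import math

def shannon_rate(P, pi):
    """Shannon entropy rate of a stationary Markov chain:
    sum_i pi_i * H(row i of P), in nats."""
    return sum(pi[i] * -sum(p * math.log(p) for p in P[i] if p > 0)
               for i in range(len(P)))

def renyi_rate(P, alpha):
    """Renyi entropy rate (2-state case): log(lambda_max) / (1 - alpha),
    where lambda_max is the largest eigenvalue of the matrix [p_ij^alpha]."""
    (a, b), (c, d) = [[p ** alpha for p in row] for row in P]
    lam = (a + d) / 2 + math.sqrt(((a - d) / 2) ** 2 + b * c)
    return math.log(lam) / (1 - alpha)

P = [[0.9, 0.1],
     [0.2, 0.8]]
pi = [2 / 3, 1 / 3]          # stationary distribution of P
h = shannon_rate(P, pi)      # Shannon entropy rate in nats
```

Taking α slightly above 1 recovers the Shannon rate to high accuracy, and for α > 1 the Rényi rate sits below it, consistent with the Shannon rate acting as a bound.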

Journal: :CoRR 2005
Ambedkar Dukkipati M. Narasimha Murty Shalabh Bhatnagar

By replacing the linear averaging in Shannon entropy with the Kolmogorov-Nagumo average (KN-average), or quasilinear mean, and further imposing the additivity constraint, Rényi proposed the first formal generalization of Shannon entropy. Using this recipe of Rényi's, one can obtain only two information measures: Shannon and Rényi entropy. Indeed, using this formalism Rényi characterized these additive en...
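The KN-average recipe is easy to verify numerically: averaging the surprisals -log p_i with the quasilinear mean generated by φ(x) = exp((1-α)x) reproduces the Rényi entropy exactly, while the identity φ gives back the ordinary linear average, i.e. Shannon entropy. A minimal check (generic definitions, not the paper's code):

```python
import math

def renyi_via_kn(p, alpha):
    """Renyi entropy as a Kolmogorov-Nagumo (quasilinear) mean of the
    surprisals I_i = -log p_i, with generator phi(x) = exp((1-alpha)*x)."""
    phi = lambda x: math.exp((1 - alpha) * x)
    phi_inv = lambda y: math.log(y) / (1 - alpha)
    return phi_inv(sum(pi * phi(-math.log(pi)) for pi in p))

def renyi_direct(p, alpha):
    """Renyi entropy from its usual closed form."""
    return math.log(sum(pi ** alpha for pi in p)) / (1 - alpha)

p = [0.5, 0.3, 0.2]
# With phi(x) = x, the KN mean is the plain average of surprisals,
# i.e. the Shannon entropy:
shannon = -sum(pi * math.log(pi) for pi in p)
```

The algebra behind the match: sum_i p_i exp((1-α)(-log p_i)) = sum_i p_i^α, so applying φ⁻¹ yields log(sum p_i^α)/(1-α).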

2008

5 Information 15
5.1 Information As Surprise 16
5.2 Average Shannon Information 18
5.3 How Surprised Should You Be? 20
5.4 Entropy As Average Shannon Information 20
5.5 Dicing...

2010
E. Bouhova-Thacker

This memo contains proofs that the Shannon entropy is the limiting case of both the Rényi entropy and the Tsallis entropy, or information. These results are also confirmed experimentally. We conclude with some general observations on the utility of entropy measures. A brief summary of the origins of the concept of physical entropy is provided in an appendix.
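The limiting behaviour the memo proves is simple to confirm numerically: as the order parameter approaches 1, both the Rényi and Tsallis entropies converge to the Shannon entropy. A small check on an arbitrary example distribution (this is an illustration in the spirit of the memo's experiments, not its actual code):

```python
import math

p = [0.4, 0.35, 0.25]
shannon = -sum(x * math.log(x) for x in p)

def renyi(p, a):
    return math.log(sum(x ** a for x in p)) / (1 - a)

def tsallis(p, q):
    return (1 - sum(x ** q for x in p)) / (q - 1)

# Shrinking the offset from 1 shrinks the gap to the Shannon entropy:
gaps = [max(abs(renyi(p, 1 + e) - shannon), abs(tsallis(p, 1 + e) - shannon))
        for e in (1e-2, 1e-4, 1e-6)]
```

The gap decays roughly linearly in the offset, since both families agree with Shannon entropy to first order at parameter value 1.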

Journal: :Open Systems & Information Dynamics 2003

Selecting appropriate inputs for intelligent models is important in order to reduce costs, save time, and increase the accuracy and efficiency of the models. The purpose of this study is to use Shannon entropy to select the optimum combination of input variables in time series modeling. Monthly time series of precipitation, temperature, and radiation for the period 1982-2010 were used from the Tabriz synoptic ...
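The abstract does not spell out its procedure, but one basic ingredient of entropy-based input selection is scoring each candidate series by the Shannon entropy of its binned values. A hedged sketch of that single step (binning scheme and synthetic series are assumptions for illustration, not the study's data or method):

```python
import math
import random
from collections import Counter

def binned_entropy(series, bins=10):
    """Shannon entropy (nats) of a series after equal-width binning."""
    lo, hi = min(series), max(series)
    width = (hi - lo) / bins or 1.0
    counts = Counter(min(int((v - lo) / width), bins - 1) for v in series)
    n = len(series)
    return -sum(c / n * math.log(c / n) for c in counts.values())

# A spread-out series carries more binned entropy than a tightly peaked one.
random.seed(1)
spread_out = [random.random() for _ in range(1000)]
peaked = [random.gauss(0.0, 0.05) for _ in range(1000)]
```

Candidate inputs (or combinations) can then be ranked by such scores before fitting the model; the study's actual criterion may differ.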
