Search results for: shannon entropy numerical simulation

Number of results: 876,050

1994
David H. Wolpert, David R. Wolf

This paper is the first of two on the problem of estimating a function of a probability distribution from a finite set of samples of that distribution. In this paper a Bayesian analysis of this problem is presented, the optimal properties of the Bayes estimators are discussed, and as an example of the formalism, closed form expressions for the Bayes estimators for the moments of the Shannon ent...

2015
Maciej Skorski

We provide a new inequality that links two important entropy notions: Shannon Entropy H1 and collision entropy H2. Our formula gives the worst possible amount of collision entropy in a probability distribution, when its Shannon Entropy is fixed. While in practice it is easier to evaluate Shannon entropy than other entropy notions, it is well known in folklore that it does not provide a good est...
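The two entropy notions compared in this abstract are easy to evaluate on a concrete distribution: Shannon entropy H1 = -Σ pᵢ log₂ pᵢ and collision entropy H2 = -log₂ Σ pᵢ² (the Rényi entropy of order 2), with H2 ≤ H1 always. A minimal numerical check, using an arbitrarily chosen example distribution:

```python
import math

def shannon_entropy(p):
    # H1 = -sum p_i * log2(p_i), in bits
    return -sum(x * math.log2(x) for x in p if x > 0)

def collision_entropy(p):
    # H2 = -log2(sum p_i^2), the Renyi entropy of order 2, in bits
    return -math.log2(sum(x * x for x in p))

p = [0.5, 0.25, 0.125, 0.125]  # arbitrary example distribution
h1 = shannon_entropy(p)
h2 = collision_entropy(p)
print(h1, h2)  # H2 <= H1 holds for every distribution
```

This only illustrates the direction of the inequality; the paper's contribution is the tight worst-case bound on H2 given a fixed H1, which the sketch does not reproduce.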

2015
Young-Seok Choi

We present a refined multiscale Shannon entropy for analyzing electroencephalogram (EEG), which reflects the underlying dynamics of EEG over multiple scales. The rationale behind this method is that neurological signals such as EEG possess distinct dynamics over different spectral modes. To deal with the nonlinear and nonstationary nature of EEG, the recently developed empirical mode decomposi...

Journal: :CoRR 2009
Ping Li

The Shannon entropy is a widely used summary statistic in, for example, network traffic measurement, anomaly detection, neural computations, spike trains, etc. This study focuses on estimating the Shannon entropy of data streams. It is known that Shannon entropy can be approximated by Rényi entropy or Tsallis entropy, which are both functions of the αth frequency moments and approach Shannon entropy a...
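The limiting behavior this abstract relies on can be verified directly: both the Rényi entropy H_α = ln(Σ pᵢ^α)/(1-α) and the Tsallis entropy S_α = (1 - Σ pᵢ^α)/(α-1) converge to the Shannon entropy (in nats) as α → 1. A small sketch on an arbitrarily chosen distribution:

```python
import math

def shannon(p):
    # Shannon entropy in nats: H = -sum p_i * ln(p_i)
    return -sum(x * math.log(x) for x in p if x > 0)

def renyi(p, alpha):
    # Renyi entropy of order alpha (alpha != 1), in nats
    return math.log(sum(x ** alpha for x in p)) / (1 - alpha)

def tsallis(p, alpha):
    # Tsallis entropy of order alpha (alpha != 1)
    return (1 - sum(x ** alpha for x in p)) / (alpha - 1)

p = [0.4, 0.3, 0.2, 0.1]  # arbitrary example distribution
h = shannon(p)
for alpha in (1.5, 1.1, 1.01, 1.001):
    print(alpha, renyi(p, alpha) - h, tsallis(p, alpha) - h)
# both gaps shrink toward 0 as alpha -> 1
```

The cited work exploits exactly this convergence to estimate Shannon entropy of streams via frequency-moment estimates; the sketch above only demonstrates the underlying limit, not the streaming algorithm.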

Journal: :Statistical applications in genetics and molecular biology 2011
Mariza de Andrade, Xin Wang

In the past few years, several entropy-based tests have been proposed for testing either single SNP association or gene-gene interaction. These tests are mainly based on Shannon entropy and have higher statistical power when compared to standard χ2 tests. In this paper, we extend some of these tests using a more generalized entropy definition, Rényi entropy, where Shannon entropy is a special c...

Journal: :Entropy 2014
Jikai Chen, Guoqing Li

As a novel data mining approach, a wavelet entropy algorithm is used to perform entropy statistics on wavelet coefficients (or reconstructed signals) at various wavelet scales on the basis of wavelet decomposition and entropy statistic theory. Shannon wavelet energy entropy, one kind of wavelet entropy algorithm, has been taken into consideration and utilized in many areas since it came into be...

2001
S. Martin, G. Morison, W. Nailon, T. Durrani

The Tsallis measure of mutual information is combined with the simultaneous perturbation stochastic approximation algorithm to register images. It is shown that Tsallis entropy can improve registration accuracy and speed of convergence, compared with Shannon entropy, in the calculation of mutual information. Simulation results show that the new algorithm achieves up to seven times faster conver...

Journal: :Entropy 2015
Hsuan Tung Peng, Yew Kam Ho

We study the correlation of the ground state of an N-particle Moshinsky model by computing the Shannon entropy in both position and momentum spaces. We have derived the Shannon entropy and mutual information with analytical forms of such an N-particle Moshinsky model, and this helps us test the entropic uncertainty principle. The Shannon entropy in position space decreases as interaction streng...
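The entropic uncertainty principle tested in this paper is the Białynicki-Birula–Mycielski bound: for the position- and momentum-space Shannon (differential) entropies, H_x + H_p ≥ 1 + ln π (in nats, with ħ = 1). A Gaussian ground state saturates it, which gives a quick closed-form check; the harmonic-oscillator parameters below (unit mass and frequency, position and momentum variances both 1/2) are an illustrative assumption, not the Moshinsky model itself:

```python
import math

def gaussian_entropy(var):
    # differential Shannon entropy of a normal density N(0, var), in nats
    return 0.5 * math.log(2 * math.pi * math.e * var)

# harmonic-oscillator ground state with hbar = m = omega = 1:
# position variance 1/2 and momentum variance 1/2 (minimum uncertainty)
hx = gaussian_entropy(0.5)
hp = gaussian_entropy(0.5)
bound = 1 + math.log(math.pi)  # Bialynicki-Birula--Mycielski lower bound
print(hx + hp, bound)  # the Gaussian saturates the bound exactly
```

In the interacting N-particle model of the paper, H_x decreases and H_p increases with interaction strength while the sum stays above this bound.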

Journal: :Int. J. Math. Mathematical Sciences 2005
C. G. Chakrabarti, Indranil Chakrabarty

We have presented a new axiomatic derivation of Shannon entropy for a discrete probability distribution on the basis of the postulates of additivity and concavity of the entropy function. We have then modified Shannon entropy to take account of observational uncertainty. The modified entropy reduces, in the limiting case, to the form of Shannon differential entropy. As an application, we have de...
