Search results for: shannon entropy numerical simulation

Number of results: 876,050

Journal: Int. J. Software and Informatics, 2007
Daoqiang Zhang, Songcan Chen, Zhi-Hua Zhou

In this paper, the well-known competitive clustering algorithm (CA) is revisited and reformulated from the point of view of entropy minimization: the second term of the objective function in CA can be seen as a quadratic (second-order) entropy. Following this interpretation, two generalized competitive clustering algorithms inspired by the Rényi entropy and the Shannon entropy, i.e. RECA and SECA,...
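
For reference, the standard definitions behind these quantities (not taken from the paper itself; identifying the "quadratic" term with the order-2 Rényi case is the usual reading and is stated here as an assumption):

% Shannon entropy of a discrete distribution p = (p_1, ..., p_K)
H(p) = -\sum_{i=1}^{K} p_i \log p_i
% Rényi entropy of order \alpha (\alpha > 0, \alpha \neq 1); H(p) is recovered as \alpha \to 1
H_\alpha(p) = \frac{1}{1-\alpha} \log \sum_{i=1}^{K} p_i^{\alpha}
% the quadratic (second-order) case: H_2(p) = -\log \sum_i p_i^2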

Journal: CoRR, 2010
Ping Li

Abstract: The long-standing problem of Shannon entropy estimation in data streams (assuming the strict Turnstile model) becomes an easy task using the technique proposed in this paper. Essentially, in order to estimate the Shannon entropy with a guaranteed ν-additive accuracy, it suffices to estimate the αth frequency moment, where α = 1 − Δ, with a guaranteed ε-multiplicative accuracy,...
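
The standard identity linking frequency moments to entropy, which this line of work builds on (the notation here is mine, not necessarily the paper's):

% counts c_1, ..., c_K of the K distinct items in the stream; F_\alpha is the \alpha-th frequency moment
F_\alpha = \sum_{i=1}^{K} c_i^{\alpha}, \qquad p_i = c_i / F_1
% empirical Rényi entropy of order \alpha; the Shannon entropy is its \alpha \to 1 limit
H_\alpha = \frac{1}{1-\alpha} \log \frac{F_\alpha}{F_1^{\alpha}}, \qquad
H = \lim_{\alpha \to 1} H_\alpha = -\sum_{i=1}^{K} p_i \log p_i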

1998
H Herzel

The order-q Tsallis entropy (Hq) and Rényi entropy (Kq) find broad application in the statistical analysis of complex phenomena. A generic problem arises, however, when these entropies need to be estimated from observed data: the finite size of data sets can lead to serious systematic and statistical errors in numerical estimates. In this paper, we focus upon the problem of estimating generalized...
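
As a toy illustration of the finite-size bias discussed here (my own sketch, not code from the paper): the naive plug-in estimate of the Shannon entropy from N samples is systematically biased downward, and the first-order Miller-Madow correction adds (K-1)/(2N) for an alphabet of K observed symbols.

import numpy as np

def plugin_entropy(samples):
    """Naive plug-in (maximum-likelihood) estimate of the Shannon entropy, in nats."""
    _, counts = np.unique(samples, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p))

def miller_madow_entropy(samples):
    """Plug-in estimate with the first-order Miller-Madow bias correction."""
    _, counts = np.unique(samples, return_counts=True)
    n, k = counts.sum(), len(counts)
    return plugin_entropy(samples) + (k - 1) / (2 * n)

# Toy experiment: uniform source over 8 symbols, true entropy = ln(8) ~ 2.079 nats.
rng = np.random.default_rng(0)
naive = [plugin_entropy(rng.integers(0, 8, size=50)) for _ in range(2000)]
print(np.log(8), np.mean(naive))  # the naive mean falls noticeably below ln(8)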

Journal: Neuro Endocrinology Letters, 2013
Taiki Takahashi

Connections between information theory and decision under uncertainty have been attracting attention in econophysics, neuroeconomics and quantum decision theory. This paper proposes a psychophysical theory of Shannon entropy based on a mathematical equivalence of delay and uncertainty in decision-making, and on the psychophysics of the perception of waiting time in probabilistic choices. Furthermore, ...

2013
Edin Mulalić, Miomir Stanković, Radomir Stanković

The Tsallis entropy was proposed as a possible generalization of the standard Boltzmann-Gibbs-Shannon (BGS) entropy as a concept aimed at efficient characterisation of non-extensive complex systems. Ever since its introduction [1], it has been successfully applied in various fields [2]. In parallel, there have been numerous attempts to provide its formal derivation from an axiomatic foundation,...
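
For context, the standard definition at issue (textbook formulas, not reproduced from the paper):

% Tsallis entropy of order q (Boltzmann constant set to 1)
S_q(p) = \frac{1}{q-1} \left( 1 - \sum_{i=1}^{K} p_i^{q} \right)
% the Boltzmann-Gibbs-Shannon entropy is recovered in the limit q \to 1
\lim_{q \to 1} S_q(p) = -\sum_{i=1}^{K} p_i \ln p_i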

Journal: CoRR, 2016
Nithin Nagaraj, Karthi Balasubramanian

Shannon entropy has been extensively used for characterizing the complexity of time series arising from chaotic dynamical systems and from stochastic processes such as Markov chains. However, for short and noisy time series, Shannon entropy performs poorly. Complexity measures based on lossless compression algorithms are a good substitute in such scenarios. We evaluate the performance of two s...
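
A minimal sketch of the kind of comparison described above, assuming a generic lossless compressor (zlib here; the specific compression-based measures evaluated in the paper are not named in this excerpt):

import random
import zlib
from collections import Counter
from math import log2

def plugin_entropy(seq):
    """Plug-in Shannon entropy of a symbol sequence, in bits per symbol."""
    counts = Counter(seq)
    n = len(seq)
    return -sum(c / n * log2(c / n) for c in counts.values())

def compression_rate(seq):
    """Crude complexity proxy: zlib-compressed size per symbol, in bits per symbol."""
    data = bytes(seq)  # assumes symbols are small non-negative integers
    return 8 * len(zlib.compress(data, 9)) / len(seq)

# A short, noisy binary series -- the regime where the plug-in estimate is unreliable
# and compression-based measures are argued to be more robust.
random.seed(1)
series = [random.randint(0, 1) for _ in range(64)]
print(plugin_entropy(series), compression_rate(series))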

2009
Ping Li

Compressed Counting (CC) [22] was recently proposed for estimating the αth frequency moments of data streams, where 0 < α ≤ 2. CC can be used for estimating Shannon entropy, which can be approximated by certain functions of the αth frequency moments as α → 1. Monitoring Shannon entropy for anomaly detection (e.g., DDoS attacks) in large networks is an important task. This paper presents a new a...
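
A small numerical illustration of the α → 1 limit mentioned above (my own sketch with made-up counts, not the Compressed Counting algorithm itself):

import numpy as np

# Normalized frequencies of a toy stream (e.g., packet counts per source address).
counts = np.array([50.0, 30.0, 15.0, 5.0])
p = counts / counts.sum()

shannon = -np.sum(p * np.log(p))
for alpha in [1.5, 1.1, 1.01, 1.001]:
    # (1 - sum_i p_i^alpha) / (alpha - 1)  ->  Shannon entropy as alpha -> 1
    approx = (1.0 - np.sum(p ** alpha)) / (alpha - 1.0)
    print(f"alpha={alpha}  approx={approx:.4f}  (Shannon={shannon:.4f})")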

2007
Danielle Sent, Linda C. van der Gaag

In diagnostic decision-support systems, test selection amounts to selecting, in a sequential manner, the test that is expected to yield the largest decrease in the uncertainty about a patient’s diagnosis. An information measure is often used to capture this uncertainty. In this paper, we study the Shannon entropy, the Gini index, and the misclassification error for this purpose. We argue that...
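
For concreteness, the three measures under study, evaluated on a hypothetical posterior over candidate diagnoses (standard formulas; the example distribution is made up):

import numpy as np

def shannon_entropy(p):
    """H(p) = -sum_i p_i log2 p_i, with zero-probability terms contributing nothing."""
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]
    return -np.sum(nz * np.log2(nz))

def gini_index(p):
    """Gini index: 1 - sum_i p_i^2."""
    return 1.0 - np.sum(np.asarray(p, dtype=float) ** 2)

def misclassification_error(p):
    """Misclassification error: 1 - max_i p_i."""
    return 1.0 - float(np.max(p))

# Hypothetical posterior over three candidate diagnoses before the next test.
posterior = [0.6, 0.3, 0.1]
print(shannon_entropy(posterior), gini_index(posterior), misclassification_error(posterior))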

2009
Marcelo R. Ubriaco

We propose entropy functions based on fractional calculus. We show that this new entropy has the same properties as the Shannon entropy except additivity, therefore making this entropy non-extensive. We show that this entropy function satisfies the Lesche and thermodynamic stability criteria.
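
The additivity property that is given up, stated for independent subsystems (a standard fact about the Shannon entropy, not a formula from the paper):

% for independent subsystems A and B with joint probabilities p_{ij} = p_i q_j,
H(A, B) = -\sum_{i,j} p_i q_j \ln (p_i q_j) = H(A) + H(B)
% an entropy for which this composition rule fails is called non-additive (non-extensive)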

2003
B. H. Lavenda

The Tsallis entropy is shown to be an additive entropy of degree-q that information scientists have been using for almost forty years. Neither is it a unique solution to the nonadditive functional equation from which random entropies are derived. Notions of additivity, extensivity and homogeneity are clarified. The relation between mean code lengths in coding theory and various expressions for ...
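
The degree-q additivity referred to above is usually written as the following composition rule for independent systems (a standard relation for entropies of degree q, quoted here as background rather than from the paper):

% for independent systems A and B the Tsallis entropy composes as
S_q(A, B) = S_q(A) + S_q(B) + (1 - q)\, S_q(A)\, S_q(B)
% ordinary additivity (the Shannon case) is recovered as q \to 1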

[Chart: number of search results per year]