Search results for: tsallis entropy

Number of results: 65979

2007
Ambedkar Dukkipati Shalabh Bhatnagar M. Narasimha Murty

Shannon entropy of a probability measure P, defined as $-\int_X \frac{dP}{d\mu}\,\ln\frac{dP}{d\mu}\,d\mu$ on a measure space $(X, \mathcal{M}, \mu)$, is not a natural extension from the discrete case. However, maximum entropy (ME) prescriptions of the Shannon entropy functional in the measure-theoretic case are consistent with those for the discrete case. Also, it is well known that Kullback-Leibler relative entropy can be extended natural...
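
For reference, the measure-theoretic quantities this abstract refers to can be written out as follows; these are the standard definitions (with $\mathcal{M}$ the sigma-algebra), stated here for orientation rather than quoted from the paper:

    S(P) = -\int_X \frac{dP}{d\mu}\,\ln\frac{dP}{d\mu}\,d\mu ,
    \qquad
    D_{\mathrm{KL}}(P \,\|\, R) = \int_X \ln\!\Big(\frac{dP}{dR}\Big)\, dP \quad (P \ll R).

When μ is the counting measure on a finite set, these reduce to the familiar discrete forms $-\sum_i p_i \ln p_i$ and $\sum_i p_i \ln(p_i/r_i)$.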

Journal: :IMA J. Math. Control & Information 2015
Marek Smieja

The definition of weighted entropy allows for easy calculation of the entropy of a mixture of measures. In this paper we investigate the problem of equivalently defining a general entropy function in weighted form. We show that, under a reasonable condition which is satisfied by the well-known Shannon, Rényi and Tsallis entropies, every entropy function can be defined equivalently in the w...
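
A minimal Python sketch of the three entropies named above, in their standard discrete forms (the natural-log convention and function names are my own choices, not taken from the paper):

    import numpy as np

    def shannon(p):
        """Shannon entropy: -sum_i p_i ln p_i (natural log)."""
        p = np.asarray(p, dtype=float)
        p = p[p > 0]                      # convention: 0 ln 0 = 0
        return -np.sum(p * np.log(p))

    def renyi(p, alpha):
        """Renyi entropy of order alpha != 1: ln(sum_i p_i^alpha) / (1 - alpha)."""
        p = np.asarray(p, dtype=float)
        return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

    def tsallis(p, q):
        """Tsallis entropy of index q != 1: (1 - sum_i p_i^q) / (q - 1)."""
        p = np.asarray(p, dtype=float)
        return (1.0 - np.sum(p ** q)) / (q - 1.0)

    p = [0.5, 0.25, 0.25]
    print(shannon(p), renyi(p, 2.0), tsallis(p, 2.0))

All three reduce to the Shannon value in the limit alpha, q → 1.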

2008
Ambedkar Dukkipati Shalabh Bhatnagar

As additivity is a characteristic property of the classical information measure, Shannon entropy, pseudo-additivity of the form $x \oplus_q y = x + y + (1-q)xy$ is a characteristic property of Tsallis entropy. Rényi in [1] generalized Shannon entropy by means of Kolmogorov-Nagumo averages, by imposing additivity as a constraint. In this paper we show that there exists no generalization for Tsallis entropy, b...
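
Written out in full, the standard Tsallis entropy and the pseudo-additivity rule quoted above read (with the Boltzmann constant set to 1):

    S_q = \frac{1 - \sum_i p_i^{\,q}}{q - 1},
    \qquad
    S_q(A \cup B) = S_q(A) + S_q(B) + (1 - q)\, S_q(A)\, S_q(B)

for independent systems A and B; both recover ordinary additivity and the Shannon entropy in the limit q → 1.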

Journal: :Physical review. E, Statistical, nonlinear, and soft matter physics 2002
Sumiyoshi Abe

The q-exponential distributions, which are generalizations of the Zipf-Mandelbrot power-law distribution, are frequently encountered in complex systems at their stationary states. From the viewpoint of the principle of maximum entropy, they can apparently be derived from three different generalized entropies: the Rényi entropy, the Tsallis entropy, and the normalized Tsallis entropy. Accordingl...
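
For context, the q-exponential family mentioned here is usually expressed through the q-exponential function; this is the common convention, not a formula quoted from the paper:

    e_q(x) = \big[\, 1 + (1-q)\, x \,\big]_{+}^{1/(1-q)},
    \qquad
    p(x) \propto e_q(-\beta x) = \big[\, 1 - (1-q)\,\beta x \,\big]_{+}^{1/(1-q)},

which reduces to the ordinary exponential $e^{-\beta x}$ as q → 1 and produces Zipf-Mandelbrot-like power-law tails for q > 1.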

2013
Edin Mulalić Miomir Stanković Radomir Stanković

The Tsallis entropy was proposed as a possible generalization of the standard Boltzmann-Gibbs-Shannon (BGS) entropy as a concept aimed at efficient characterisation of non-extensive complex systems. Ever since its introduction [1], it has been successfully applied in various fields [2]. In parallel, there have been numerous attempts to provide its formal derivation from an axiomatic foundation,...

Journal: :Entropy 2011
John C. Baez Tobias Fritz Tom Leinster

There are numerous characterizations of Shannon entropy and Tsallis entropy as measures of information obeying certain properties. Using work by Faddeev and Furuichi, we derive a very simple characterization. Instead of focusing on the entropy of a probability measure on a finite set, this characterization focuses on the “information loss”, or change in entropy, associated with a measure-preser...
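
To make the "information loss" notion concrete, here is a small Python sketch for a finite distribution pushed forward along a deterministic map; the use of Shannon entropy and the toy map are illustrative choices of mine, not the paper's construction:

    import numpy as np

    def shannon(p):
        p = np.asarray(p, dtype=float)
        p = p[p > 0]
        return -np.sum(p * np.log(p))

    def pushforward(p, f, m):
        """Push a distribution on {0,...,n-1} forward along f into {0,...,m-1}."""
        q = np.zeros(m)
        for i, pi in enumerate(p):
            q[f(i)] += pi
        return q

    p = np.array([0.5, 0.25, 0.125, 0.125])
    f = lambda i: i // 2            # merge outcomes pairwise
    q = pushforward(p, f, 2)

    # information loss = entropy of the source minus entropy of its image
    print(shannon(p) - shannon(q))  # non-negative for deterministic maps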

Journal: :CoRR 2005
Ambedkar Dukkipati M. Narasimha Murty Shalabh Bhatnagar

By replacing linear averaging in Shannon entropy with Kolmogorov-Nagumo averages (KN-averages), or quasilinear means, and further imposing the additivity constraint, Rényi proposed the first formal generalization of Shannon entropy. Using this recipe of Rényi, one can obtain only two information measures: Shannon and Rényi entropy. Indeed, using this formalism Rényi characterized these additive en...
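
For reference, the Kolmogorov-Nagumo (quasilinear) mean, and the way the usual choice of φ yields Rényi entropy, can be written as follows (standard formulation, stated for orientation):

    \langle x \rangle_{\varphi} = \varphi^{-1}\!\Big( \sum_i p_i\, \varphi(x_i) \Big),
    \qquad
    \varphi(t) = e^{(1-\alpha)t}
    \;\Longrightarrow\;
    \big\langle -\ln p \big\rangle_{\varphi} = \frac{1}{1-\alpha}\,\ln \sum_i p_i^{\alpha} = H_{\alpha},

while the linear choice φ(t) = t gives back Shannon entropy.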

Journal: :CoRR 2003
Garimella Rama Murthy

In this research paper, it is proved that an approximation to the Gibbs-Shannon entropy measure naturally leads to the Tsallis entropy for the real parameter q = 2. Several interesting measures based on the input as well as the output of a discrete memoryless channel are provided, and some of the properties of those measures are discussed. It is expected that these results will be of utility in Information ...
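
A one-line calculation showing how a first-order approximation of the Gibbs-Shannon measure lands on the q = 2 Tsallis entropy; this is the standard expansion, given here for orientation rather than as the paper's exact argument:

    -\sum_i p_i \ln p_i \;\approx\; \sum_i p_i (1 - p_i) \;=\; 1 - \sum_i p_i^{2} = S_{q=2},

using $-\ln p_i \approx 1 - p_i$ and $\sum_i p_i = 1$; note that $(1 - \sum_i p_i^{2})/(2-1)$ is exactly the Tsallis entropy at q = 2.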

2016
Dmitry S. Shalymov

In this paper, non-stationary processes that tend to maximize the Tsallis entropy are considered. Systems with a discrete probability distribution for the Tsallis entropy have already been investigated on the basis of the Speed-Gradient principle. The evolution of the probability density function and the continuous form of the Tsallis entropy are considered. A set of equations describing the dynamics of a s...
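
The continuous form of the Tsallis entropy referred to here is usually written as (standard definition, with k = 1):

    S_q[p] = \frac{1}{q-1}\Big( 1 - \int p(x)^{q}\, dx \Big),

which recovers the differential Shannon entropy $-\int p(x)\ln p(x)\,dx$ in the limit q → 1.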

2001
A. Bezerianos S. Tong Y. Zhu N. Thakor

In this paper, we introduce Nonadditive Information Theory through the axiomatic formulation of Tsallis entropy. We show that systems with transitions from high dimensionality to few degrees of freedom are better described by the nonadditive formalism. Such a biological system is the brain, and brain rhythms are its macroscopic dynamic trace. We will show with simulations that Tsallis entropy is a po...
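
As a rough illustration of estimating Tsallis entropy from a signal such as an EEG trace, the Python sketch below uses an amplitude histogram; the bin count, the value of q and the synthetic signal are arbitrary choices of mine, not the authors' processing pipeline:

    import numpy as np

    def tsallis_from_signal(x, q=1.5, bins=32):
        """Estimate Tsallis entropy of a 1-D signal from its amplitude histogram."""
        counts, _ = np.histogram(x, bins=bins)
        p = counts / counts.sum()             # empirical bin probabilities
        return (1.0 - np.sum(p ** q)) / (q - 1.0)

    # synthetic stand-in for a brain-rhythm-like trace: noisy 10 Hz oscillation
    t = np.linspace(0.0, 2.0, 1000)
    x = np.sin(2 * np.pi * 10 * t) + 0.3 * np.random.randn(t.size)

    print(tsallis_from_signal(x, q=1.5))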
