Search results for: tsallis entropy
Number of results: 65,979
Sparsity and entropy are pillar notions of modern theories in signal processing and information theory. However, there is no clear consensus among scientists on how to characterize these notions. Previous efforts have contributed to understanding sparsity or entropy individually, from specific research interests. This paper proposes a mathematical formalism, a joint axiomatic characterization, ...
We derive a dual-primal recursive algorithm based on the Fenchel duality framework, extending Dykstra’s successive-projections and Csiszár’s I-projections schemes, to handle Tsallis MaxEnt. The Tsallis entropy Sq(p) is a one-parameter extension of Shannon’s entropy H(p) in the sense that Sq→1(p) = H(p). The solution of the Tsallis MaxEnt problem takes the form of a q-deformed Gibbs distribution, which is a pow...
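The limit Sq→1(p) = H(p) mentioned in this abstract is easy to verify numerically. A minimal sketch (the function name and the example distribution are illustrative, not taken from the paper):

```python
import numpy as np

def tsallis_entropy(p, q):
    """Tsallis entropy S_q(p) = (1 - sum_i p_i^q) / (q - 1).

    For q -> 1 this reduces to the Shannon entropy H(p) = -sum_i p_i ln p_i,
    which we return directly at q = 1 to avoid the 0/0 form.
    """
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # drop zero-probability entries (0 * ln 0 := 0)
    if abs(q - 1.0) < 1e-12:
        return float(-np.sum(p * np.log(p)))  # Shannon limit
    return float((1.0 - np.sum(p ** q)) / (q - 1.0))

p = [0.5, 0.3, 0.2]
# S_q approaches H(p) as q -> 1:
print(tsallis_entropy(p, 1.0))       # Shannon entropy
print(tsallis_entropy(p, 1.000001))  # nearly the same value
print(tsallis_entropy(p, 2.0))       # 1 - sum p_i^2
```

Note that S_q is non-additive for q ≠ 1 (pseudo-additivity S_q(A,B) = S_q(A) + S_q(B) + (1−q)S_q(A)S_q(B) for independent A, B), which is why the MaxEnt machinery needs the extensions described above.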
A maximum entropy copula is the copula associated with the joint distribution, with prescribed marginal distributions on [0, 1], which maximizes the Tsallis–Havrda–Charvát entropy with q = 2. We find necessary and sufficient conditions for each maximum entropy copula to be a copula in the class introduced in Rodríguez-Lallena and Úbeda-Flores (2004), and we also show that each copula in that cla...
We show that Tsallis’ distributions can be derived from the standard (Shannon) maximum entropy setting, by incorporating a constraint on the divergence between the distribution and another distribution imagined as its tail. In this setting, we find an underlying entropy which is the Rényi entropy. Furthermore, escort distributions and generalized means appear as a direct consequence of the cons...
Spin relaxation close to the glass temperature of CuMn and AuFe spin glasses is shown, by neutron spin echo, to follow a generalized exponential function which explicitly introduces hierarchically constrained dynamics and macroscopic interactions. The interaction parameter is directly related to the normalized Tsallis nonextensive entropy parameter q and exhibits universal scaling with reduced ...
The construction of efficient and effective decision trees remains a key topic in machine learning because of their simplicity and flexibility. Many heuristic algorithms have been proposed to construct near-optimal decision trees. Most of them, however, are greedy algorithms that have the drawback of reaching only local optima. Besides, the conventional split criteria they use, e.g. Shannon ...
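A Tsallis-entropy split criterion of the kind this abstract alludes to can be sketched as follows. This is a generic illustration under my own naming, not the paper's algorithm; note that at q = 2 the Tsallis impurity 1 − Σ pᵢ² coincides with the familiar Gini index, and as q → 1 it recovers the Shannon information gain:

```python
import numpy as np

def tsallis_impurity(labels, q=2.0):
    """Tsallis impurity of a label multiset: (1 - sum_c p_c^q) / (q - 1)."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    if abs(q - 1.0) < 1e-12:
        return float(-np.sum(p * np.log(p)))  # Shannon entropy limit
    return float((1.0 - np.sum(p ** q)) / (q - 1.0))

def split_gain(parent, left, right, q=2.0):
    """Impurity reduction of splitting `parent` into `left` + `right`."""
    n = len(parent)
    w_l, w_r = len(left) / n, len(right) / n
    return tsallis_impurity(parent, q) - (
        w_l * tsallis_impurity(left, q) + w_r * tsallis_impurity(right, q)
    )

# A perfectly separating split of a balanced binary sample:
print(split_gain([0, 0, 1, 1], [0, 0], [1, 1], q=2.0))  # removes all impurity
```

A greedy tree builder would evaluate `split_gain` over candidate thresholds and take the maximizer at each node, which is exactly the locally-optimal behavior the abstract criticizes.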
We discuss a one-parameter family of generalized cross entropies between two distributions, indexed by a power parameter and called the projective power entropy. The cross entropy essentially reduces to the Tsallis entropy when the two distributions are taken to be equal. Statistical and probabilistic properties associated with the projective power entropy are extensively investigated, including a characterizati...
Because it accounts for both the gray-level distribution and the spatial neighborhood information via the two-dimensional (2-D) histogram of the image, the 2-D maximum Tsallis entropy (2DMTE) method often achieves better segmentation results, and owing to its controllable parameter it is more flexible than other 2-D entropy methods. However, its performance is sensitive to its parameter...
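The core of maximum-Tsallis-entropy thresholding is to pick the gray level that maximizes the combined Tsallis entropy of the foreground and background classes. A minimal 1-D sketch (the full 2DMTE method uses the 2-D gray/neighborhood histogram; this simplification and all names in it are illustrative):

```python
import numpy as np

def tsallis_threshold(hist, q=0.8):
    """Return the gray level t maximizing the total Tsallis entropy of the
    background (levels < t) and foreground (levels >= t) classes.

    Classes are combined with the pseudo-additivity rule
    S = S_b + S_f + (1 - q) * S_b * S_f.
    """
    p = np.asarray(hist, dtype=float)
    p = p / p.sum()
    best_t, best_s = 0, -np.inf
    for t in range(1, len(p)):
        wb, wf = p[:t].sum(), p[t:].sum()
        if wb == 0 or wf == 0:
            continue  # degenerate split: one class is empty
        pb, pf = p[:t] / wb, p[t:] / wf  # within-class distributions
        pb, pf = pb[pb > 0], pf[pf > 0]
        sb = (1.0 - np.sum(pb ** q)) / (q - 1.0)
        sf = (1.0 - np.sum(pf ** q)) / (q - 1.0)
        s = sb + sf + (1.0 - q) * sb * sf
        if s > best_s:
            best_t, best_s = t, s
    return best_t

# Bimodal toy histogram: two modes separated by an empty valley.
hist = [5, 10, 5, 0, 0, 0, 0, 5, 10, 5]
print(tsallis_threshold(hist, q=0.8))  # lands in the valley between the modes
```

The parameter sensitivity the abstract mentions is visible here: different q values weight the tails of each class distribution differently and can shift the selected threshold.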