Search results for: rényi
Number of results: 6772
The Rényi entropy generalizes Shannon entropy to a one-parameter family of entropies. Tsallis entropy is likewise a generalization of Shannon entropy, but its measure is non-logarithmic. After Shannon entropy was introduced, the conditional Shannon entropy was derived and its properties became known. Likewise, for Tsallis entropy, the conditional entropy was introduced a...
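The two one-parameter families mentioned above can be sketched numerically. This is a minimal illustration (assuming discrete probability vectors and natural logarithms; the function names are illustrative, not from the paper): both the Rényi and the non-logarithmic Tsallis entropy recover Shannon entropy as their parameter tends to 1.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i log p_i (in nats)."""
    p = np.asarray(p, dtype=float)
    return -np.sum(p * np.log(p))

def renyi_entropy(p, alpha):
    """Rényi entropy H_alpha(p) = log(sum_i p_i^alpha) / (1 - alpha), alpha != 1."""
    p = np.asarray(p, dtype=float)
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def tsallis_entropy(p, q):
    """Tsallis entropy S_q(p) = (1 - sum_i p_i^q) / (q - 1): non-logarithmic."""
    p = np.asarray(p, dtype=float)
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

p = [0.5, 0.3, 0.2]
# Both families approach the Shannon entropy as the parameter approaches 1.
for a in (0.5, 0.999, 2.0):
    print(a, renyi_entropy(p, a), tsallis_entropy(p, a))
print("Shannon:", shannon_entropy(p))
```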
This paper introduces “swiveled Rényi entropies” as an alternative to the Rényi entropic quantities put forward in [Berta et al., Physical Review A 91, 022333 (2015)]. What distinguishes the swiveled Rényi entropies from the prior proposal of Berta et al. is that there is an extra degree of freedom: an optimization over unitary rotations with respect to particular fixed bases (swivels). A conse...
Recently a new quantum generalization of the Rényi divergence and the corresponding conditional Rényi entropies was proposed. Here we report on a surprising relation between conditional Rényi entropies based on this new generalization and conditional Rényi entropies based on the quantum relative Rényi entropy that was used in previous literature. This generalizes the well-known duality relation...
Two different distributions may have equal Rényi entropy; thus a distribution cannot be identified by its Rényi entropy. In this paper, we explore properties of the Rényi entropy of order statistics. Several characterizations are established based on the Rényi entropy of order statistics and record values. These include characterizations of a distribution on the basis of the differences between...
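The opening claim, that distinct distributions can share a Rényi entropy value, is easy to exhibit numerically. The construction below is my own illustrative example, not one from the paper: a three-point distribution is built to have the same collision probability (sum of squared masses) as the uniform two-point distribution, so their order-2 Rényi entropies coincide while entropies of other orders differ.

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Rényi entropy H_alpha(p) = log(sum_i p_i^alpha) / (1 - alpha), alpha != 1."""
    p = np.asarray(p, dtype=float)
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

# Uniform on two symbols: sum of squares is 0.5.
p = np.array([0.5, 0.5])

# Build q = (0.6, q1, q2) with q1 + q2 = 0.4 and q1^2 + q2^2 = 0.14,
# so that sum of squares is again 0.36 + 0.14 = 0.5.
# Then q1 * q2 = ((q1+q2)^2 - (q1^2+q2^2)) / 2 = 0.01, so q1, q2 are the
# roots of t^2 - 0.4 t + 0.01 = 0.
q1, q2 = np.roots([1.0, -0.4, 0.01])
q = np.array([0.6, q1, q2])

print("H_2:", renyi_entropy(p, 2), renyi_entropy(q, 2))      # equal
print("H_0.5:", renyi_entropy(p, 0.5), renyi_entropy(q, 0.5))  # different
```

A single Rényi entropy value of one order therefore cannot identify a distribution, which is why the characterizations in the abstract use order statistics and record values rather than a lone entropy value.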
Rényi entropy of order α is a general measure of entropy. In this paper we derive estimates for the Rényi entropy of a mixture of sources in terms of the entropies of the single sources. These relations make it possible to compute the Rényi entropy dimension of arbitrary order for a mixture of measures. The key to obtaining these results is our new definition of the weighted Rényi entropy. It is shown t...
Rényi divergence is related to Rényi entropy much like information divergence (also called Kullback-Leibler divergence or relative entropy) is related to Shannon’s entropy, and comes up in many settings. It was introduced by Rényi as a measure of information that satisfies almost the same axioms as information divergence. We review the most important properties of Rényi divergence, including it...
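The analogy drawn above can be made concrete with a small numerical sketch (assuming discrete distributions with full support and natural logarithms; names are illustrative): the Rényi divergence of order α recovers the Kullback-Leibler divergence in the limit α → 1, just as Rényi entropy recovers Shannon entropy.

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    """D_alpha(P||Q) = log(sum_i p_i^alpha q_i^(1-alpha)) / (alpha - 1), alpha != 1."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return np.log(np.sum(p ** alpha * q ** (1.0 - alpha))) / (alpha - 1.0)

def kl_divergence(p, q):
    """Information (Kullback-Leibler) divergence D(P||Q) = sum_i p_i log(p_i / q_i)."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return np.sum(p * np.log(p / q))

p = [0.5, 0.3, 0.2]
q = [0.2, 0.5, 0.3]
# D_alpha is nondecreasing in alpha and tends to KL divergence as alpha -> 1.
for a in (0.5, 0.999, 2.0):
    print(a, renyi_divergence(p, q, a))
print("KL:", kl_divergence(p, q))
```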
Rényi entropy and Rényi divergence evidence a long track record of usefulness in information theory and its applications. Alfred Rényi never got around to generalizing mutual information in a similar way. In fact, in the literature there are several possible ways to accomplish such generalization, most notably those suggested by Suguru Arimoto, Imre Csiszár, and Robin Sibson. We collect several...
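Of the generalizations named above, Sibson's is the most commonly used and has a closed form. The sketch below is a hedged illustration (the joint pmf, tolerances, and function names are my own assumptions), using the standard expression I_α(X;Y) = (α/(α−1)) log Σ_y (Σ_x P(x) W(y|x)^α)^{1/α}, which reduces to Shannon's mutual information as α → 1.

```python
import numpy as np

def mutual_information(pxy):
    """Shannon mutual information of a joint pmf (rows index x, columns index y), in nats."""
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    return np.sum(pxy * np.log(pxy / (px * py)))

def sibson_alpha_mi(pxy, alpha):
    """Sibson's alpha-mutual information, alpha != 1:
    I_alpha = alpha/(alpha-1) * log sum_y (sum_x P(x) W(y|x)^alpha)^(1/alpha)."""
    px = pxy.sum(axis=1, keepdims=True)
    w = pxy / px                                  # channel W(y|x)
    inner = np.sum(px * w ** alpha, axis=0) ** (1.0 / alpha)
    return alpha / (alpha - 1.0) * np.log(np.sum(inner))

# Illustrative joint distribution (all entries positive).
pxy = np.array([[0.3, 0.1],
                [0.2, 0.4]])
for a in (0.5, 1.0001, 2.0):
    print(a, sibson_alpha_mi(pxy, a))
print("I(X;Y):", mutual_information(pxy))
```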
The squashed entanglement quantifies the amount of entanglement in a bipartite quantum state, and it satisfies all of the axioms desired for an entanglement measure. The quantum discord is a measure of quantum correlations that are different from those due to entanglement. What these two measures have in common is that they are both based upon the conditional quantum mutual information. In [Ber...
Shannon and Rényi information theory have been applied to coupling estimation in complex systems using time series of their dynamical states. By analysing how information is transferred between constituent parts of a complex system, it is possible to infer the coupling parameters of the system. To this end, we introduce the partial Rényi transfer entropy and we give an alternative derivation of...
The conventional channel resolvability problem refers to the determination of the minimum rate needed for an input process to approximate the output distribution of a channel in either the total variation distance or the relative entropy. In contrast to previous works, in this paper, we use the (normalized or unnormalized) Rényi divergence (with the Rényi parameter in [0,2]) to measure the leve...