Search results for: rényi entropy

Number of results: 70986

2015
Mohammad H. Ansari, Yuli V. Nazarov

We evaluate Rényi entropy flows from generic quantum heat engines (QHE) to a weakly coupled probe environment kept in thermal equilibrium. We show that the flows are determined not only by heat flow but also by a quantum coherent flow that can be separately measured in experiment apart from the heat flow measurement. The same pertains to Shannon entropy flow. This appeals for a revision of the ...
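
For orientation, the quantity at the center of these results is, for a quantum state with density matrix ρ, the Rényi entropy S_α(ρ) = ln(Tr ρ^α) / (1 - α). The sketch below is a minimal illustration of that definition only; it does not reproduce the authors' flow calculation, and the two-level density matrix in it is a made-up example.

    import numpy as np

    def renyi_entropy(rho, alpha):
        # S_alpha(rho) = ln(Tr rho^alpha) / (1 - alpha), defined for alpha != 1.
        p = np.linalg.eigvalsh(rho)        # eigenvalues of the density matrix
        p = p[p > 1e-12]                   # drop numerically zero eigenvalues
        return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

    # Made-up mixed two-level state (Hermitian, trace one) for demonstration.
    rho = np.array([[0.7, 0.1],
                    [0.1, 0.3]])
    print(renyi_entropy(rho, 2.0))         # order-2 ("collision") Rényi entropy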

Journal: IEEE Trans. Information Theory, 2001
Ziad Rached, Fady Alajaji, L. Lorne Campbell

In this work, we examine the existence and the computation of the Rényi divergence rate, lim_{n→∞} (1/n) D_α(p^(n) ‖ q^(n)), between two time-invariant finite-alphabet Markov sources of arbitrary order and arbitrary initial distributions described by the probability distributions p^(n) and q^(n), respectively. This yields a generalization of a result of Nemetz where he assumed that the initial probabilities under p^(n) and q^(n) are strictly p...
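
The single-letter quantity behind this rate is the order-α Rényi divergence, D_α(p‖q) = log(Σ_i p_i^α q_i^(1-α)) / (α - 1). Below is a minimal sketch of that quantity alone, assuming strictly positive distributions; it does not compute the Markov-source rate, and the two example distributions are made up.

    import numpy as np

    def renyi_divergence(p, q, alpha):
        # D_alpha(p || q) = log(sum_i p_i^alpha * q_i^(1 - alpha)) / (alpha - 1), alpha != 1.
        p = np.asarray(p, dtype=float)
        q = np.asarray(q, dtype=float)
        return np.log(np.sum(p ** alpha * q ** (1.0 - alpha))) / (alpha - 1.0)

    # Made-up strictly positive distributions on a three-letter alphabet.
    p = [0.5, 0.3, 0.2]
    q = [0.4, 0.4, 0.2]
    print(renyi_divergence(p, q, 0.5))
    print(renyi_divergence(p, q, 2.0))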

2013
Shun Watanabe, Masahito Hayashi

This paper investigates the privacy amplification problem and compares two existing bounds: the exponential bound derived by one of the authors and the min-entropy bound derived by Renner. It turns out that the exponential bound is better than the min-entropy bound when the security parameter is rather small relative to the block length, and that the min-entropy bound is better than the exponential bo...
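
For context, the min-entropy used in Renner's bound is the α → ∞ limit of the Rényi entropy, H_∞(p) = -log max_i p_i, and Rényi entropies are non-increasing in α. The sketch below only illustrates that relationship; the privacy-amplification bounds themselves are not reproduced, and the source distribution is made up.

    import numpy as np

    def renyi_entropy(p, alpha):
        # Base-2 Rényi entropy; Shannon entropy at alpha = 1, min-entropy as alpha -> infinity.
        p = np.asarray(p, dtype=float)
        p = p[p > 0]
        if np.isinf(alpha):
            return -np.log2(p.max())            # H_inf, the min-entropy
        if alpha == 1.0:
            return -np.sum(p * np.log2(p))      # Shannon entropy
        return np.log2(np.sum(p ** alpha)) / (1.0 - alpha)

    p = [0.5, 0.25, 0.125, 0.125]               # made-up source distribution
    for a in (1.0, 2.0, np.inf):
        print(a, renyi_entropy(p, a))           # values decrease as alpha grows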

Journal: Entropy, 2018
Renata Rychtáriková, Jan Korbel, Petr Machácek, Dalibor Stys

We introduce novel information-entropic variables—a Point Divergence Gain (Ω_α), a Point Divergence Gain Entropy (I_α), and a Point Divergence Gain Entropy Density (P_α)—which are derived from the Rényi entropy and describe spatio-temporal changes between two consecutive discrete multidimensional distributions. The behavior of Ω_α is simulated for typical distributions and, together with I_α and ...
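
A minimal sketch of the kind of quantity involved, under a loudly stated assumption: Ω_α is taken here to be the change in Rényi entropy when a single element of a discrete histogram moves from bin i to bin j between two consecutive frames. This assumed form is for illustration only and may differ in detail from the authors' definition; the histogram is made up.

    import numpy as np

    def renyi_entropy(counts, alpha):
        p = np.asarray(counts, dtype=float)
        p = p[p > 0] / p.sum()                  # normalize the histogram
        return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

    def point_divergence_gain(counts, i, j, alpha):
        # Assumed form: Rényi-entropy change when one occurrence moves from bin i to bin j.
        after = np.asarray(counts, dtype=float).copy()
        after[i] -= 1
        after[j] += 1
        return renyi_entropy(after, alpha) - renyi_entropy(counts, alpha)

    hist = [40, 30, 20, 10]                     # made-up histogram of one frame
    print(point_divergence_gain(hist, i=0, j=3, alpha=2.0))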

Journal: Entropy, 2016
Renata Rychtáriková, Jan Korbel, Petr Machácek, Petr Císar, Jan Urban, Dalibor Stys

We generalize the point information gain (PIG) and derived quantities, i.e., point information gain entropy (PIE) and point information gain entropy density (PIED), for the case of the Rényi entropy and simulate the behavior of PIG for typical distributions. We also use these methods for the analysis of multidimensional datasets. We demonstrate the main properties of PIE/PIED spectra for the re...
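
As with the previous entry, a minimal sketch under a stated assumption: the PIG of a value i is taken here to be the Rényi entropy of the histogram with one occurrence of i removed, minus the Rényi entropy of the full histogram. This is an illustrative reading of the abstract rather than the authors' exact definition, and the histogram is made up.

    import numpy as np

    def renyi_entropy(counts, alpha):
        p = np.asarray(counts, dtype=float)
        p = p[p > 0] / p.sum()
        return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

    def point_information_gain(counts, i, alpha):
        # Assumed form: entropy without one occurrence of value i, minus entropy of the full histogram.
        reduced = np.asarray(counts, dtype=float).copy()
        reduced[i] -= 1
        return renyi_entropy(reduced, alpha) - renyi_entropy(counts, alpha)

    hist = [40, 30, 20, 10]                     # made-up pixel-value histogram
    spectrum = [point_information_gain(hist, i, alpha=2.0) for i in range(len(hist))]
    print(spectrum)                             # one PIG value per histogram bin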

2015
Jayadev Acharya, Alon Orlitsky, Ananda Theertha Suresh, Himanshu Tyagi

It was recently shown that estimating the Shannon entropy H(p) of a discrete k-symbol distribution p requires Θ(k/log k) samples, a number that grows near-linearly in the support size. In many applications H(p) can be replaced by the more general Rényi entropy of order α, Hα(p). We determine the number of samples needed to estimate Hα(p) for all α, showing that α < 1 requires a super-linear, r...
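
A natural baseline for this problem is the empirical (plug-in) estimator, which substitutes sample frequencies into H_α; the paper's results concern how many samples any estimator fundamentally needs. A minimal plug-in sketch follows; the support size, sample size, and Dirichlet-generated distribution are made-up choices for the demo.

    import numpy as np

    def renyi_entropy(p, alpha):
        p = np.asarray(p, dtype=float)
        p = p[p > 0]
        return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

    def plugin_estimate(samples, k, alpha):
        # Empirical (plug-in) estimate of H_alpha from i.i.d. samples over {0, ..., k-1}.
        freq = np.bincount(samples, minlength=k) / len(samples)
        return renyi_entropy(freq, alpha)

    rng = np.random.default_rng(0)
    k = 1000
    p = rng.dirichlet(np.ones(k))               # made-up k-symbol distribution
    samples = rng.choice(k, size=5000, p=p)
    print("true H_2   :", renyi_entropy(p, 2.0))
    print("plug-in H_2:", plugin_estimate(samples, k, 2.0))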

2014
Christoph Bunte, Amos Lapidoth

A task is randomly drawn from a finite set of tasks and is described using a fixed number of bits. All the tasks that share its description must be performed. Upper and lower bounds on the minimum ρ-th moment of the number of performed tasks are derived. The key is an analog of the Kraft Inequality for partitions of finite sets. When a sequence of tasks is produced by a source of a given Rényi ...
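
To make the setup concrete: fix a partition of the task set into description cells (tasks in the same cell share a description), draw a task from the source, and perform every task in its cell; the quantity bounded in the paper is the ρ-th moment of that cell's size. The sketch below evaluates this moment for one hypothetical partition and source only; it does not reproduce the paper's bounds.

    import numpy as np

    def rho_moment_of_performed_tasks(p, cells, rho):
        # E[ |cell(X)|^rho ], where X ~ p and cells is a partition of {0, ..., len(p)-1}.
        p = np.asarray(p, dtype=float)
        return sum(p[list(cell)].sum() * len(cell) ** rho for cell in cells)

    # Made-up example: six tasks described with two bits, i.e. at most four cells.
    p = [0.3, 0.25, 0.2, 0.1, 0.1, 0.05]
    cells = [[0], [1], [2], [3, 4, 5]]          # hypothetical partition into four cells
    print(rho_moment_of_performed_tasks(p, cells, rho=1.0))
    print(rho_moment_of_performed_tasks(p, cells, rho=2.0))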

2015
Gauss M. Cordeiro, Morad Alizadeh, M. H. Tahir, M. Mansoor, Marcelo Bourguignon, G. G. Hamedani

We introduce a new family of continuous models called the beta odd log-logistic generalized family of distributions. We study some of its mathematical properties. Its density function can be symmetrical, left-skewed, right-skewed, reversed-J, unimodal, or bimodal, and its hazard rate can be constant, increasing, decreasing, upside-down bathtub, or J-shaped. Five special models are discussed. W...
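
For readers unfamiliar with the hazard-rate terminology: the hazard rate of a lifetime density f with distribution function F is h(t) = f(t) / (1 - F(t)), and families like this one are valued precisely because h can take increasing, decreasing, bathtub, or upside-down-bathtub shapes. The BOLL-G density itself is not reproduced here; as a simpler stand-in, the sketch below evaluates the hazard of an ordinary log-logistic baseline, whose shape switches from decreasing to upside-down bathtub as the shape parameter crosses one.

    import numpy as np

    def loglogistic_hazard(t, scale, shape):
        # h(t) = f(t) / (1 - F(t)) for a log-logistic(scale, shape) lifetime.
        z = (t / scale) ** shape
        f = (shape / scale) * (t / scale) ** (shape - 1) / (1.0 + z) ** 2
        F = z / (1.0 + z)
        return f / (1.0 - F)

    t = np.linspace(0.1, 5.0, 5)
    print(loglogistic_hazard(t, scale=1.0, shape=0.8))  # decreasing hazard
    print(loglogistic_hazard(t, scale=1.0, shape=3.0))  # upside-down bathtub (unimodal) hazard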

Journal: Journal of Mathematical Chemistry, 2012

Chart: number of search results per year