Search results for: shannons entropy
Number of results: 65,347
We give a counterexample to the vector generalization of Costa's entropy power inequality (EPI) due to Liu, Liu, Poor and Shamai. In particular, the claimed inequality can fail if the matrix-valued parameter in the convex combination does not commute with the covariance of the additive Gaussian noise. Conversely, the inequality holds if these two matrices commute. For a random vector X with dens...
A task is randomly drawn from a finite set of tasks and is described using a fixed number of bits. All the tasks that share its description must be performed. Upper and lower bounds on the minimum ρ-th moment of the number of performed tasks are derived. The case where a sequence of tasks is produced by a source and n tasks are jointly described using nR bits is considered. If R is larger than ...
We investigate how information leakage reduces computational entropy of a random variable X. Recall that HILL and metric computational entropy are parameterized by quality (how distinguishable is X from a variable Z that has true entropy) and quantity (how much true entropy is there in Z). We prove an intuitively natural result: conditioning on an event of probability p reduces the quality of m...
This paper presents an application of Approximate Entropy (ApEn) and Sample Entropy (SampEn) in the analysis of heart rhythm, blood pressure and stroke volume for the diagnosis of vasovagal syndrome. The analyzed biosignals were recorded during positive passive tilt tests—HUTT(+). Signal changes and their entropy were compared in three main phases of the test: supine position, tilt, and pre-syn...
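The snippet above refers to Sample Entropy (SampEn), a standard regularity measure for biosignals. As a hedged illustration of the general technique (not the paper's own implementation), a minimal sketch might look like this, using the common Chebyshev-distance formulation with tolerance r and template length m:

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Sample Entropy (SampEn) of a 1-D signal.

    Counts pairs of template vectors of length m (and m + 1) whose
    Chebyshev distance is at most the tolerance r, excluding
    self-matches.  SampEn = -ln(A / B), where A and B are the match
    counts for lengths m + 1 and m respectively.
    """
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * np.std(x)  # a common default tolerance

    def count_matches(mm):
        n = len(x) - mm + 1
        templates = np.array([x[i:i + mm] for i in range(n)])
        count = 0
        for i in range(n - 1):
            # Chebyshev distance from template i to all later templates
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += int(np.sum(dist <= r))
        return count

    b = count_matches(m)
    a = count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else float("inf")
```

A strictly periodic signal yields a SampEn near zero, while white noise yields a much larger value, which is why the measure can discriminate between regular and irregular heart-rhythm phases.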
In this paper, we have developed conditions under which the entropy function and the residual entropy function characterize the distribution. We have also studied some stochastic comparisons based on the entropy measure and established relations between entropy comparisons and comparisons with respect to other measures in reliability. Conditions for decreasing (increasing) uncertainty in a resi...
The concept of uncertain entropy is used to provide a quantitative measurement of the uncertainty associated with uncertain variables. After introducing the definition, this paper gives some examples of entropy of uncertain variables. Furthermore, this paper proposes the maximum entropy principle for uncertain variables, that is, out of all the uncertainty distributions satisfying given constrai...
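The maximum entropy principle mentioned above has a familiar Shannon-entropy analogue: with no constraints beyond normalization, the uniform distribution over n outcomes attains the maximum entropy log n. A minimal numerical sketch (an illustration of this classical fact, not of the paper's uncertain-variable framework):

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum p_i log p_i in nats, with 0 log 0 := 0."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

uniform = np.full(4, 0.25)                   # attains the maximum log(4)
skewed = np.array([0.7, 0.1, 0.1, 0.1])      # any other distribution is lower
```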
We present a simple proof of the entropy-power inequality using an optimal transportation argument which takes the form of a simple change of variables. The same argument yields a reverse inequality involving a conditional differential entropy which has its own interest. For each inequality, the equality case is easily captured by this method and the proof is formally identical in one and sever...
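For readers unfamiliar with the inequality discussed in this abstract: the entropy power of a random vector X in R^n is N(X) = exp(2 h(X) / n) / (2 pi e), and the EPI states N(X + Y) >= N(X) + N(Y) for independent X, Y, with equality in the Gaussian case. A quick numerical sanity check of the equality case (an illustration only, not the paper's transport argument):

```python
import numpy as np

def gaussian_entropy(var):
    """Differential entropy h(X) in nats of a 1-D Gaussian with variance var."""
    return 0.5 * np.log(2 * np.pi * np.e * var)

def entropy_power(var):
    """Entropy power N(X) = exp(2 h(X)) / (2 pi e); for a 1-D Gaussian
    this simplifies back to the variance itself."""
    return np.exp(2 * gaussian_entropy(var)) / (2 * np.pi * np.e)

# EPI equality case: for independent Gaussians X, Y with variances
# 1.5 and 2.5, X + Y is Gaussian with variance 4.0, and
# N(X + Y) = N(X) + N(Y).
nx, ny = entropy_power(1.5), entropy_power(2.5)
n_sum = entropy_power(1.5 + 2.5)
```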
Information entropy and its extensions, which are important generalizations of entropy, are currently applied to many research domains. In this paper, a novel generalized relative entropy is constructed to avoid some defects of traditional relative entropy. We present the structure of generalized relative entropy after the discussion of defects in relative entropy. Moreover, some properties of t...