Search results for: s entropy

Number of results: 771,800

Journal: Open Syst. Inform. Dynam., 2004
Robert Alicki, Mark Fannes

We give an elementary self-contained proof that the minimal entropy output of arbitrary products of the channels ρ → (𝟙 − ρ^T)/(d − 1) is additive. This paper is concerned with the efficiency of transmission of classical information through a particular quantum channel. Generally, the minimal entropy output of a general quantum channel Γ, given in terms of a completely positive trace-preserving map on the d-...
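A minimal numerical sketch of the quantity involved, assuming the channel acts on d × d density matrices as Φ(ρ) = (𝟙 − ρ^T)/(d − 1); the dimension d = 4, the random pure-state inputs, and the function names are illustrative choices, not taken from the paper. Every pure input yields output entropy log(d − 1), and since the minimum over all inputs is attained on pure states, that value is the single-copy minimal output entropy.

```python
import numpy as np

def channel(rho: np.ndarray) -> np.ndarray:
    """Apply the map rho -> (I - rho^T) / (d - 1) to a d x d density matrix."""
    d = rho.shape[0]
    return (np.eye(d) - rho.T) / (d - 1)

def von_neumann_entropy(rho: np.ndarray) -> float:
    """Von Neumann entropy in nats, ignoring numerically zero eigenvalues."""
    eigs = np.linalg.eigvalsh(rho)
    eigs = eigs[eigs > 1e-12]
    return float(-np.sum(eigs * np.log(eigs)))

d = 4
rng = np.random.default_rng(0)
# For every pure input the output entropy equals log(d - 1):
# I - |psi*><psi*| has eigenvalue 1 with multiplicity d - 1 and eigenvalue 0 once.
for _ in range(5):
    psi = rng.normal(size=d) + 1j * rng.normal(size=d)
    psi /= np.linalg.norm(psi)
    rho = np.outer(psi, psi.conj())
    print(von_neumann_entropy(channel(rho)), "vs", np.log(d - 1))
```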

2009
Richard L. Kaufmann, William R. Paterson

Boltzmann's H function was evaluated using 10 years of 1-min distribution functions. These results were used to study the long-term averaged spatial distributions of four entropy parameters. The average entropy density s_a(x), where a = i for ions and a = e for electrons, increased when moving Earthward or toward the flanks. The magnitudes of these entropy changes were similar for ions and e...
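One common convention for Boltzmann's H function and the entropy density it defines is sketched below; the normalization and any additive constants used in the paper may differ, so this is an assumed form, not the authors' definition.

```latex
\[
H_a(\mathbf{x}) \;=\; \int f_a(\mathbf{x},\mathbf{v})\,\ln f_a(\mathbf{x},\mathbf{v})\,d^3v ,
\qquad
s_a(\mathbf{x}) \;=\; -k_B\, H_a(\mathbf{x}),
\qquad a \in \{i, e\},
\]
```

where f_a is the measured phase-space distribution function of species a.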

Journal: Entropy, 2017
Yaliang Liu, Li Zou, Yibo Sun, Xinhua Yang

An evaluation model for aluminum alloy welded joint low-cycle fatigue data based on information entropy is proposed. By calculating and analyzing the information entropy of the decision attributes, the quantitative contributions of stress concentration, plate thickness, and loading mode to fatigue failure are investigated. Results reveal that the total information entropy of the fatigue data b...
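A small sketch of the information-entropy calculation such an evaluation rests on; the attribute name and the example values below are hypothetical, not data from the paper.

```python
import math
from collections import Counter

def shannon_entropy(values) -> float:
    """Shannon entropy (in bits) of the empirical distribution of a discrete attribute."""
    counts = Counter(values)
    n = len(values)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Hypothetical decision attribute (e.g. loading mode per specimen), for illustration only.
loading_mode = ["tension", "tension", "bending", "torsion", "bending", "tension"]
print(shannon_entropy(loading_mode))
```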

Journal: IEEE Trans. Information Theory, 1988
Anselm Blumer, Robert J. McEliece

If optimality is measured by average codeword length, Huffman's algorithm gives optimal codes, and the redundancy can be measured as the difference between the average codeword length and Shannon's entropy. If the objective function is replaced by an exponentially weighted average, then a simple modification of Huffman's algorithm gives optimal codes. The redundancy can now be measured as the d...
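A sketch of the standard average-length case described in the first sentence: a textbook Huffman construction and the redundancy measured as average codeword length minus Shannon entropy. The exponentially weighted variant discussed in the paper is not implemented here, and the example probabilities are illustrative.

```python
import heapq
import math

def huffman_code_lengths(probs):
    """Codeword lengths of a binary Huffman code for the given symbol probabilities."""
    # Heap items: (subtree probability, tie-breaker, symbol indices in the subtree).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    counter = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:  # every symbol in the merged subtree moves one bit deeper
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, counter, s1 + s2))
        counter += 1
    return lengths

probs = [0.4, 0.3, 0.2, 0.1]
lengths = huffman_code_lengths(probs)
avg_len = sum(p * l for p, l in zip(probs, lengths))
entropy = -sum(p * math.log2(p) for p in probs)
print("average length:", avg_len)
print("Shannon entropy:", entropy)
print("redundancy:", avg_len - entropy)
```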

2010
Bernd Carl, Aicke Hinrichs, Alain Pajor

We establish optimal estimates of Gelfand numbers or Gelfand widths of absolutely convex hulls cov(K) of precompact subsets K ⊂ H of a Hilbert space H by the metric entropy of the set K, where the covering numbers N(K, ε) of K by ε-balls of H satisfy the Lorentz condition ∫₀^∞ (log₂ N(K, ε))^{r/s} dε < ∞ for some fixed 0 < r, s ≤ ∞, with the usual modifications in the cases r = ∞, 0 < s < ∞ and 0 <...
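For readers unfamiliar with the notation, a sketch of the covering numbers entering the condition above; the precise form of the Lorentz condition in the paper may differ from the reconstruction given here.

```latex
\[
N(K,\varepsilon) \;=\; \min\Bigl\{\, n \in \mathbb{N} :
K \subset \bigcup_{j=1}^{n} \bigl(x_j + \varepsilon B_H\bigr),\ x_j \in H \,\Bigr\},
\qquad
\int_0^\infty \bigl(\log_2 N(K,\varepsilon)\bigr)^{r/s}\, d\varepsilon \;<\; \infty ,
\]
```

where B_H denotes the closed unit ball of H.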

Journal: Entropy, 2005
Yi-Fang Chang

Since fluctuations can be magnified due to internal interactions under a certain condition, the equal-probability assumption does not hold. The entropy would be defined as S(t) = −k ∑_r P_r(t) ln P_r(t). From this, or from S = k ln Ω, a possible decrease of entropy in an internal condensed process is calculated. Internal interactions, which bring about inapplicability of the statistical independence, c...
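The two expressions quoted in the abstract are related by the standard equal-probability reduction, shown here for completeness:

```latex
\[
S(t) \;=\; -k \sum_r P_r(t)\,\ln P_r(t)
\;\;\xrightarrow{\;P_r \,=\, 1/\Omega\;}\;\;
-k \sum_{r=1}^{\Omega} \frac{1}{\Omega}\,\ln\frac{1}{\Omega}
\;=\; k \ln \Omega .
\]
```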

Thesis: Ministry of Science, Research and Technology - Isfahan University of Technology - Faculty of Mathematics, 1390

The main objective in sampling is to select a sample from a population in order to estimate some unknown population parameter, usually a total or a mean of some variable of interest. A simple way to take a sample of size n is to let all the possible samples have the same probability of being selected. This is called simple random sampling, and then all units have the same probability of being ch...
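A minimal sketch of the estimator this describes, assuming a without-replacement design in which every subset of size n is equally likely; the population, sample size, and seeds are made up for illustration.

```python
import random

def srs_estimate_mean(population, n, seed=None):
    """Estimate the population mean from a simple random sample of size n,
    drawn without replacement so that every subset of size n is equally likely."""
    rng = random.Random(seed)
    sample = rng.sample(population, n)
    return sum(sample) / n

# Hypothetical population of 1000 values; the sample mean estimates the true mean.
rng = random.Random(1)
population = [rng.gauss(50, 10) for _ in range(1000)]
print("true mean:", sum(population) / len(population))
print("SRS estimate (n=50):", srs_estimate_mean(population, 50, seed=2))
```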

1999
Shahar Hod

Recently, we derived an improved universal upper bound to the entropy of a charged system, S ≤ π(2Eb − q²)/ℏ. There was, however, some uncertainty in the value of the numerical factor which multiplies the q² term. In this paper we remove this uncertainty; we rederive this upper bound from an application of the generalized second law of thermodynamics to a gedanken experiment in which an entr...
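A hedged reading of the bound, assuming units with c = k_B = 1 and that b denotes the radius of the smallest sphere enclosing the system (the usual convention, but an assumption here): for a neutral system it reduces to the Bekenstein bound.

```latex
\[
S \;\le\; \frac{\pi\,(2Eb - q^2)}{\hbar}
\;\;\xrightarrow{\;q \,=\, 0\;}\;\;
S \;\le\; \frac{2\pi E b}{\hbar}.
\]
```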

2005
G. W. Pratt

Using XMM-Newton observations, we investigate the scaling and structural properties of the ICM entropy in a sample of 10 nearby (z < 0.2) relaxed galaxy clusters in the temperature range 2-9 keV. We derive the local entropy-temperature (S-T) relation at R = 0.1, 0.2, 0.3 and 0.5 R_200. The logarithmic slope of the relation is the same within the 1σ error at all scaled radii. However, the intrin...
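In X-ray cluster studies the "entropy" is conventionally the adiabat-related quantity below; it is assumed, not stated in this snippet, that the paper follows this standard definition.

```latex
\[
S \;=\; \frac{k\,T}{n_e^{2/3}},
\]
```

with T the ICM temperature and n_e the electron number density, evaluated at fixed fractions of R_200.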

Journal: Physical Review E: Statistical, Nonlinear, and Soft Matter Physics, 2015
E. P. Bento, G. M. Viswanathan, M. G. E. da Luz, R. Silva

The laws of thermodynamics constrain the formulation of statistical mechanics at the microscopic level. The third law of thermodynamics states that the entropy must vanish at absolute zero temperature for systems with nondegenerate ground states in equilibrium. Conversely, the entropy can vanish only at absolute zero temperature. Here we ask whether or not generalized entropies satisfy this fun...
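To make the question concrete, one widely used generalized entropy is the Tsallis form shown below; it is given only as an example of the kind of functional a third-law test applies to, and which families the paper actually examines is not stated in this snippet.

```latex
\[
S_q \;=\; k\,\frac{1 - \sum_i p_i^{\,q}}{q - 1},
\qquad
S_q \;\to\; -k \sum_i p_i \ln p_i \quad (q \to 1).
\]
```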
