Search results for: shannons entropy rate

Number of results: 1,019,691

Journal: Journal of Physics A: Mathematical and General 2004

Journal: Physical Communication 2016
Marco Martalò, Riccardo Raheli

This paper discusses and analyzes various models of binary correlated sources, which may be relevant in several distributed communication scenarios. These models are statistically characterized in terms of joint Probability Mass Function (PMF) and covariance. Closed-form expressions for the joint entropy of the sources are also presented. The asymptotic entropy rate for a very large number of sou...
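As a rough illustration of the kind of joint-entropy computation the abstract refers to, a minimal sketch in Python (the joint PMF below is invented for illustration, not taken from the paper):

```python
import numpy as np

# Hypothetical joint PMF of two correlated binary sources (X, Y);
# rows index X in {0, 1}, columns index Y in {0, 1}.
joint_pmf = np.array([[0.4, 0.1],
                      [0.1, 0.4]])

def joint_entropy(pmf):
    """Shannon joint entropy H(X, Y) in bits, ignoring zero-probability cells."""
    p = pmf[pmf > 0]
    return float(-np.sum(p * np.log2(p)))

print(joint_entropy(joint_pmf))  # ≈ 1.722 bits
```

Stronger correlation (more mass on the diagonal) pushes the joint entropy down toward the single-source entropy H(X) = 1 bit.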

The entropy generation of a non-Newtonian fluid in rotational flow between two concentric cylinders is analyzed for the case where the outer cylinder is fixed and the inner cylinder rotates at a constant angular speed. The viscosity of the non-Newtonian fluid is assumed to depend on both temperature and shear rate. The Nahme law and the Carreau equation are used to model this dependenc...

Journal: IEEE Trans. Information Theory 1999
Jun Muramatsu, Fumio Kanaya

Source coding theorems for general sources are presented. For a source, assumed to be a probability measure on the set of infinite-length sequences over a finite alphabet, the notion of the almost-sure sup entropy rate is defined; it is an extension of the Shannon entropy rate. When both the encoder and the decoder know that a sequence is generated by the source, the following two theorems can be p...

2015
Timo Mulder, Jorn Peters

A sequence of independent and identically distributed events can on average be encoded using H(X) bits per source symbol. In practice, however, series of events often exhibit arbitrary dependence between events. Such processes with arbitrary dependence between variables are called stochastic processes. This report shows how to calculate the...
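For a process with dependence between events, the entropy rate drops below the i.i.d. value H(X). A minimal sketch for the standard special case of a stationary Markov chain, where the rate is the stationary-weighted average of the row entropies of the transition matrix (the matrix below is an invented example):

```python
import numpy as np

def entropy_rate(P):
    """Entropy rate (bits/symbol) of a stationary Markov chain with
    transition matrix P, computed as H = sum_i pi_i * H(P[i, :])."""
    # Stationary distribution pi: left eigenvector of P for eigenvalue 1.
    vals, vecs = np.linalg.eig(P.T)
    pi = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
    pi = pi / pi.sum()
    row_entropies = [-np.sum(p[p > 0] * np.log2(p[p > 0])) for p in P]
    return float(np.dot(pi, row_entropies))

# Illustrative two-state chain: state 0 is "sticky", so successive
# symbols are correlated and the rate falls below 1 bit/symbol.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])
print(entropy_rate(P))  # ≈ 0.557 bits per symbol
```

With independent symbols (identical rows in P) the same formula reduces to the plain Shannon entropy H(X).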

2002
Dmitriy Genzel, Eugene Charniak

We present a constancy rate principle governing language generation. We show that this principle implies that local measures of entropy (ignoring context) should increase with the sentence number. We demonstrate that this is indeed the case by measuring entropy in three different ways. We also show that this effect has both lexical (which words are used) and non-lexical (how the words are used)...

Journal: :Entropy 2016
David Darmon

We introduce a method for quantifying the inherent unpredictability of a continuous-valued time series via an extension of the differential Shannon entropy rate. Our extension, the specific entropy rate, quantifies the amount of predictive uncertainty associated with a specific state, rather than averaged over all states. We relate the specific entropy rate to popular ‘complexity’ measures such...
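The specific entropy rate in the abstract targets continuous-valued series; a related and much simpler idea for discrete sequences is the plug-in conditional-block-entropy estimate h_k = H(k) − H(k−1), sketched below (the helper names and the test sequence are illustrative, not from the paper):

```python
from collections import Counter
from math import log2

def block_entropy(seq, k):
    """Plug-in Shannon entropy (bits) of overlapping length-k blocks."""
    blocks = [tuple(seq[i:i + k]) for i in range(len(seq) - k + 1)]
    n = len(blocks)
    return -sum((c / n) * log2(c / n) for c in Counter(blocks).values())

def entropy_rate_estimate(seq, k):
    """Estimate of the entropy rate via h_k = H(k) - H(k-1)."""
    return block_entropy(seq, k) - block_entropy(seq, k - 1)

# A perfectly predictable alternating sequence: its per-symbol entropy
# H(1) is 1 bit, yet the estimated rate is close to 0.
periodic = [0, 1] * 500
print(entropy_rate_estimate(periodic, 2))
```

This captures the same intuition as the abstract: averaged single-symbol entropy can be high while the process, conditioned on its past, is nearly deterministic.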


A compact fin-tube heat exchanger is used to transfer heat from the fluid flowing inside the tubes to the air outside. In this study, entropy production and the optimized Reynolds number for finned-tube heat exchangers, based on minimum entropy production, are investigated. As a result, the total entropy of compact heat exchangers, which is the sum of the production rate of fluid entropy ins...

Journal: IEEE Trans. Information Theory 2014
Christoph Bunte, Amos Lapidoth

A task is randomly drawn from a finite set of tasks and is described using a fixed number of bits. All the tasks that share its description must be performed. Upper and lower bounds on the minimum ρ-th moment of the number of performed tasks are derived. The case where a sequence of tasks is produced by a source and n tasks are jointly described using nR bits is considered. If R is larger than ...

Chart of the number of search results per year

Click on the chart to filter the results by publication year.