Search results for: information entropy theory

Number of results: 1,867,273

2000
M. S. GŁOWACKI

In the contemporary world, information theory is increasingly widely used, which is why its elements should be included in physics education at the higher level, especially as far as the connection between this theory and physics measurements is concerned. The following paper consists of methodological propositions for those elements. The information theory equations presented in the first...

1997
Vincent van de Laar W. Bastiaan Kleijn Ed F. Deprettere

We estimated the perceptual entropy rate of the phonemes of American English and found that the upper limit of the perceptual entropy of voiced phonemes is approximately 1.4 bit/sample, whereas the perceptual entropy of unvoiced phonemes is approximately 0.9 bit/sample. Results indicate that a simple voiced/unvoiced classification is suboptimal when trying to minimize bit rate. We used two differ...

2005
Jacques Calmet Xavier Calmet

Differential entropy is the entropy of a continuous random variable. It is related to the shortest description length and thus similar to the entropy of a discrete random variable. A basic introduction can be found in the book of Cover and Thomas [1]. In this paper, we are interested in the concept of shortest description length. Indeed, in a recent paper [2], we have investigated the case of s...
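For reference, the differential entropy this abstract refers to is the standard definition found in Cover and Thomas [1], shown here with its closed form for a Gaussian variable as a worked example:

```latex
% Differential entropy of a continuous random variable X with density f:
h(X) = -\int_{\mathcal{X}} f(x)\,\log f(x)\,dx

% For a Gaussian X \sim \mathcal{N}(\mu, \sigma^2) this evaluates to:
h(X) = \tfrac{1}{2}\,\log\!\left(2\pi e \sigma^2\right)
```

Unlike the discrete case, differential entropy can be negative (e.g. when $\sigma^2 < \tfrac{1}{2\pi e}$), which is one reason the analogy to description length needs the care the paper discusses.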

1985
Roberto Luzzi Áurea R. Vasconcellos

Some general considerations on the notion of entropy in physics are presented. An attempt is made to clarify the question of the differentiation between physical entropy (the Clausius-Boltzmann one) and quantities called entropies associated with Information Theory, which are in fact generating functionals for the derivation of probability distributions and not thermodynamic functions of state. T...

1992

Neural encoding and decoding focus on the question: "What does the response of a neuron tell us about a stimulus?" In this chapter we consider a related but different question: "How much does the neural response tell us about a stimulus?" The techniques of information theory allow us to answer this question in a quantitative manner. Furthermore, we can use them to ask what forms of neural r...

Journal: Journal of Industrial Engineering, International, 2011
N. Javid A. Makui

This paper models the cell formation problem as a distributed decision network. It proposes an approach based on the application and extension of information theory concepts in order to analyze informational complexity in an agent-based system due to interdependence between agents. Based on this approach, new quantitative concepts and definitions are proposed in order to measure the amount of t...

Journal: Entropy, 2013
Rongxi Zhou Ru Cai Guanqun Tong

Although the concept of entropy originated in thermodynamics, its relevant principles, especially the principles of maximum entropy and minimum cross-entropy, have been extensively applied in finance. In this paper, we review the concepts and principles of entropy, as well as their applications in the field of finance, especially in portfolio selection and asset pricing. Furth...

Journal: CoRR, 2009
Fabio G. Guerrero

A simple method is presented for finding the entropy and redundancy of a reasonably long sample of English text by direct computer processing, from first principles according to Shannon's theory. As an example, results on the entropy of the English language have been obtained based on a total of 20.3 million characters of written English, considering symbols from one to five hundred characters...
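The first-order (single-character) case of this kind of estimation can be sketched as follows; this is a minimal illustration of the Shannon entropy computation, not the paper's actual method (the function name and toy sample are illustrative, and the paper's longer-range symbol statistics and 20.3-million-character corpus are not reproduced here):

```python
from collections import Counter
from math import log2

def char_entropy(text: str) -> float:
    """First-order entropy of a text in bits per character.

    Estimates symbol probabilities from observed character
    frequencies, then applies H = -sum(p * log2(p)).
    """
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * log2(c / n) for c in counts.values())

sample = "the quick brown fox jumps over the lazy dog"
h = char_entropy(sample)
print(f"{h:.3f} bits/character")
```

Extending this to N-character symbols (as the abstract describes for contexts up to five hundred characters) means counting N-grams instead of single characters, which quickly runs into sparse-data issues that such first-principles estimates must address.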

2012
Weiping Tang Weina Gao

Entropy is a measure of the uncertainty associated with an uncertain variable. So far, entropy, quadratic entropy and sine entropy have been proposed for uncertain variables. This paper will propose a triangular entropy for uncertain variables, and verify its properties such as translation invariance and positive linearity.
