Search results for: information entropy theory

Number of results: 1,867,273

2002
Frédéric Koessler

This paper examines strategic information revelation in a Cournot duopoly with incomplete information about firm 1’s cost and information precision. Firm 2 relies on certifiable, ex post submissions from firm 1, without necessarily knowing whether firm 1 knows its own cost. The sequential equilibria of the induced communication game are determined for different certifiability possibilities....

Journal: :Entropy 2017
Jung In Seo Yongku Kim

Abstract: In this paper, we provide an entropy inference method that is based on an objective Bayesian approach for upper record values having a two-parameter logistic distribution. We derive the entropy that is based on the i-th upper record value and the joint entropy that is based on the upper record values. Moreover, we examine their properties. For objective Bayesian analysis, we obtain ob...

Journal: :Inf. Sci. 2003
Luc Knockaert

Rényi entropies are compared to generalized log-Fisher information and variational entropies in the context of translation, scale and concentration invariance. It is proved that the Rényi entropies occupy a special place amongst these entropies. It is also shown that Shannon entropy is centrally positioned amidst the Rényi entropies.
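The relationship described above — Shannon entropy sitting centrally among the Rényi entropies — can be illustrated with a minimal sketch. The helper below (names and the example distribution are illustrative, not from the paper) computes the Rényi entropy of order α and treats the α → 1 limit as Shannon entropy; the family is non-increasing in α, with Shannon recovered near α = 1.

```python
import math

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha (in nats) for a discrete distribution p.

    The limit alpha -> 1 is the Shannon entropy, handled as a special case.
    """
    if abs(alpha - 1.0) < 1e-12:
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return math.log(sum(pi ** alpha for pi in p)) / (1.0 - alpha)

p = [0.5, 0.25, 0.25]          # illustrative distribution
shannon = renyi_entropy(p, 1.0)
near_one = renyi_entropy(p, 1.0001)   # approaches Shannon as alpha -> 1
```

For this distribution the ordering H_0.5(p) > H_1(p) > H_2(p) holds, reflecting the monotone decrease of Rényi entropy in the order α.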

2008
Ambedkar Dukkipati Shalabh Bhatnagar

As additivity is a characteristic property of the classical information measure, Shannon entropy, pseudo-additivity of the form x ⊕_q y = x + y + (1−q)xy is a characteristic property of Tsallis entropy. Rényi in [1] generalized Shannon entropy by means of Kolmogorov–Nagumo averages, by imposing additivity as a constraint. In this paper we show that there exists no generalization for Tsallis entropy, b...
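The pseudo-additivity property mentioned above can be verified numerically: for two independent distributions, the Tsallis entropy of the product distribution equals S_q(X) + S_q(Y) + (1−q)·S_q(X)·S_q(Y) exactly. A minimal sketch (the distributions and the value of q are illustrative):

```python
def tsallis(p, q):
    """Tsallis entropy S_q(p) = (1 - sum p_i^q) / (q - 1) for q != 1."""
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

px = [0.7, 0.3]                        # illustrative marginals
py = [0.4, 0.6]
pjoint = [a * b for a in px for b in py]  # joint pmf of independent X, Y

q = 1.5
lhs = tsallis(pjoint, q)
rhs = tsallis(px, q) + tsallis(py, q) + (1 - q) * tsallis(px, q) * tsallis(py, q)
```

The identity is exact here, not approximate: writing A = Σp_i^q and B = Σr_j^q, both sides reduce to (1 − AB)/(q − 1).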

Journal: :CoRR 2013
Jean-François Bercher

In this communication, we describe some interrelations between generalized q-entropies and a generalized version of Fisher information. In information theory, the de Bruijn identity links the Fisher information and the derivative of the entropy. We show that this identity can be extended to generalized versions of entropy and Fisher information. More precisely, a generalized Fisher information ...
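The classical de Bruijn identity mentioned above — the derivative of the differential entropy of X + √t·Z (Z standard Gaussian) equals half the Fisher information — can be checked in the fully Gaussian case, where both sides have closed forms. This sketch verifies only the classical identity, not the generalized version the paper develops; the parameter values are illustrative.

```python
import math

def gaussian_entropy(v):
    """Differential entropy of N(0, v) in nats: (1/2) ln(2*pi*e*v)."""
    return 0.5 * math.log(2 * math.pi * math.e * v)

def gaussian_fisher(v):
    """Fisher information of N(0, v): 1/v."""
    return 1.0 / v

sigma2, t, dt = 1.0, 0.5, 1e-6
# X ~ N(0, sigma2), so X + sqrt(t) Z ~ N(0, sigma2 + t)
lhs = (gaussian_entropy(sigma2 + t + dt)
       - gaussian_entropy(sigma2 + t - dt)) / (2 * dt)   # d/dt h(X + sqrt(t) Z)
rhs = 0.5 * gaussian_fisher(sigma2 + t)                  # (1/2) J(X + sqrt(t) Z)
```

The central finite difference matches (1/2)·1/(σ² + t) to high accuracy, as the identity predicts.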

Journal: :Information 2012
Christopher D. Fiorillo

It has been proposed that the general function of the brain is inference, which corresponds quantitatively to the minimization of uncertainty (or the maximization of information). However, there has been a lack of clarity about exactly what this means. Efforts to quantify information have been in agreement that it depends on probabilities (through Shannon entropy), but there has long been a dis...

Journal: :IEEE Trans. Information Theory 1998
Zhen Zhang Raymond W. Yeung

Given n discrete random variables Ω = {X_1, …, X_n}, associated with any subset α of {1, 2, …, n} there is a joint entropy H(X_α), where X_α = {X_i : i ∈ α}. This can be viewed as a function defined on 2^{1, 2, …, n} taking values in [0, +∞). We call this function the entropy function of Ω. The nonnegativity of the joint entropies implies that this function is nonnegative; the nonnegativity of the conditional join...
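The entropy function described above — a map from subsets of the variables to their joint entropies — can be computed directly for a small example and checked against a basic Shannon-type inequality (submodularity). The joint pmf below is illustrative, not from the paper:

```python
import itertools
import math

# Illustrative joint pmf over three binary variables (X1, X2, X3)
pmf = {
    (0, 0, 0): 0.125, (0, 0, 1): 0.125, (0, 1, 0): 0.125, (0, 1, 1): 0.125,
    (1, 0, 0): 0.20,  (1, 0, 1): 0.05,  (1, 1, 0): 0.05,  (1, 1, 1): 0.20,
}

def H(alpha):
    """Joint entropy H(X_alpha) in nats for a subset alpha of {0, 1, 2}."""
    marg = {}
    for outcome, p in pmf.items():
        key = tuple(outcome[i] for i in sorted(alpha))
        marg[key] = marg.get(key, 0.0) + p
    return -sum(p * math.log(p) for p in marg.values() if p > 0)

# The entropy function on all 2^3 subsets; H(empty set) = 0
h = {frozenset(a): H(a)
     for r in range(4) for a in itertools.combinations(range(3), r)}

# Submodularity: H(A) + H(B) >= H(A u B) + H(A n B)
A, B = frozenset({0, 1}), frozenset({1, 2})
submodular = h[A] + h[B] >= h[A | B] + h[A & B] - 1e-12
```

Nonnegativity and submodularity of this function are instances of the Shannon-type inequalities the paper takes as its starting point.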

Journal: :Int. J. Semantic Computing 2013
David Ellerman

The logical basis for information theory is the newly developed logic of partitions that is dual to the usual Boolean logic of subsets. The key concept is a "distinction" of a partition, an ordered pair of elements in distinct blocks of the partition. The logical concept of entropy based on partition logic is the normalized counting measure of the set of distinctions of a partition on a finite ...
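The definition above — logical entropy as the normalized counting measure of the set of distinctions — can be sketched concretely. For a partition of an n-element set into blocks, the distinction count divided by n² equals 1 − Σ(|B|/n)²; the example partition is illustrative:

```python
from itertools import product

def logical_entropy(blocks):
    """Logical entropy h(pi) = 1 - sum over blocks B of (|B|/n)^2."""
    n = sum(len(b) for b in blocks)
    return 1.0 - sum((len(b) / n) ** 2 for b in blocks)

def distinction_count(blocks):
    """Number of ordered pairs (u, v) whose elements lie in distinct blocks."""
    block_of = {x: i for i, b in enumerate(blocks) for x in b}
    elems = list(block_of)
    return sum(1 for u, v in product(elems, elems)
               if block_of[u] != block_of[v])

blocks = [{0, 1, 2}, {3, 4}, {5}]   # illustrative partition of {0, ..., 5}
n = 6
same = logical_entropy(blocks) == distinction_count(blocks) / n ** 2
```

Both routes give 22/36 here: of the 36 ordered pairs, 14 fall within a block and the remaining 22 are distinctions.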

2015
Gauss M. Cordeiro Morad Alizadeh M. H. Tahir M. Mansoor Marcelo Bourguignon G. G. Hamedani

We introduce a new family of continuous models called the beta odd log-logistic generalized family of distributions. We study some of its mathematical properties. Its density function can be symmetrical, left-skewed, right-skewed, reversed-J, unimodal and bimodal shaped, and has constant, increasing, decreasing, upside-down bathtub and J-shaped hazard rates. Five special models are discussed. W...

Journal: :Physical review. E, Statistical, nonlinear, and soft matter physics 2002
Velimir M. Ilic Miomir S. Stankovic

The form invariance of pseudoadditivity is shown to determine the structure of nonextensive entropies. Nonextensive entropy is defined as the appropriate expectation value of nonextensive information content, similar to the definition of Shannon entropy. Information content in a nonextensive system is obtained uniquely from generalized axioms by replacing the usual additivity with pseudoadditiv...
