Similar Resources
Distribution of Mutual Information
The mutual information of two random variables i and j with joint probabilities {πij} is commonly used in learning Bayesian nets as well as in many other fields. The chances πij are usually estimated by the empirical sampling frequency nij/n, leading to a point estimate I(nij/n) for the mutual information. To answer questions like “is I(nij/n) consistent with zero?” or “what is the probability t...
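To make the point estimate mentioned above concrete, here is a minimal Python sketch (not code from the paper) that computes the plug-in estimate I(nij/n) from a small, hypothetical contingency table of counts nij:

```python
import numpy as np

def plugin_mutual_information(counts):
    """Plug-in estimate I(n_ij / n) of mutual information (in nats)
    from a contingency table of counts n_ij."""
    counts = np.asarray(counts, dtype=float)
    n = counts.sum()
    p = counts / n                      # empirical joint frequencies n_ij / n
    pi = p.sum(axis=1, keepdims=True)   # row marginals
    pj = p.sum(axis=0, keepdims=True)   # column marginals
    mask = p > 0                        # skip zero cells; 0 * log 0 := 0
    return float(np.sum(p[mask] * np.log(p[mask] / (pi @ pj)[mask])))

# Hypothetical 2x2 table of joint counts n_ij
table = [[30, 10],
         [ 5, 55]]
print(plugin_mutual_information(table))
```

The questions raised in the abstract concern how this single number fluctuates around the true mutual information across samples, which the paper addresses through its distribution.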
On the Distribution of Mutual Information
In the early years of information theory, mutual information was defined as a random variable, and error probability bounds for communication systems were obtained in terms of its probability distribution. We advocate a return to this perspective for a renewed look at information theory for general channel models and finite coding blocklengths. For capacity-achieving inputs, we characterize the ...
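As a small illustration of treating mutual information as a random variable, the sketch below (a toy, not the paper's channel model) computes the information density i(x; y) = log p(x, y)/(p(x)p(y)) for an assumed binary channel and checks that its mean equals the mutual information; the input distribution and transition matrix are arbitrary assumptions:

```python
import numpy as np

p_x = np.array([0.5, 0.5])                    # assumed input distribution
p_y_given_x = np.array([[0.9, 0.1],           # assumed channel transition matrix
                        [0.2, 0.8]])

p_xy = p_x[:, None] * p_y_given_x             # joint distribution p(x, y)
p_y = p_xy.sum(axis=0)                        # output marginal p(y)

# Information density i(x; y) = log p(x, y) / (p(x) p(y)), one value per outcome
info_density = np.log(p_xy / (p_x[:, None] * p_y[None, :]))

mutual_info = float(np.sum(p_xy * info_density))                    # E[i(X; Y)] = I(X; Y)
variance = float(np.sum(p_xy * (info_density - mutual_info) ** 2))  # spread of i(X; Y)
print(mutual_info, variance)
```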
Clustering of a Number of Genes Affecting in Milk Production using Information Theory and Mutual Information
Information theory, a branch of mathematics, is used in genetic and bioinformatics analyses and can be applied to many analyses of biological structures and sequences. Bio-computational grouping of genes facilitates genetic analysis, sequencing, and structure-based analyses. In this study, after retrieving gene and exon DNA sequences affecting milk yield in dairy ...
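The abstract is truncated, so the study's exact procedure is not shown here. Purely as a hypothetical illustration of mutual-information-based grouping, the sketch below builds a pairwise mutual-information matrix over toy symbol sequences and feeds a derived distance into SciPy's hierarchical clustering; the toy sequences, the 1/(1+MI) distance, and the two-cluster cut are all assumptions, not the authors' method:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

def mutual_information(a, b):
    """Empirical mutual information (nats) between two equal-length symbol sequences."""
    a, b = np.asarray(a), np.asarray(b)
    sa, sb = np.unique(a), np.unique(b)
    joint = np.array([[np.logical_and(a == x, b == y).mean() for y in sb] for x in sa])
    pa = joint.sum(axis=1, keepdims=True)
    pb = joint.sum(axis=0, keepdims=True)
    mask = joint > 0
    return float(np.sum(joint[mask] * np.log(joint[mask] / (pa @ pb)[mask])))

# Hypothetical toy sequences standing in for gene features
seqs = [list("ACCGTACCGT"), list("ACCGTACCGA"), list("TTGACTTGAC"), list("TTGACTTGAT")]

# Pairwise MI turned into a distance: higher MI -> smaller distance
n = len(seqs)
dist = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        dist[i, j] = dist[j, i] = 1.0 / (1.0 + mutual_information(seqs[i], seqs[j]))

# Condensed distance vector for SciPy, then average-linkage clustering into 2 groups
condensed = dist[np.triu_indices(n, k=1)]
labels = fcluster(linkage(condensed, method="average"), t=2, criterion="maxclust")
print(labels)
```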
Distribution of Mutual Information from Complete and Incomplete Data
Mutual information is widely used, in a descriptive way, to measure the stochastic dependence of categorical random variables. In order to address questions such as the reliability of the descriptive value, one must consider sample-to-population inferential approaches. This paper deals with the posterior distribution of mutual information, as obtained in a Bayesian framework by a second-order D...
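The paper treats this posterior analytically. As a rough numerical counterpart under an assumed flat Dirichlet prior, a Monte Carlo sketch of the posterior of the mutual information could look like the following; the contingency table and prior strength are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

def mi_from_joint(p):
    """Mutual information (nats) of a joint probability matrix p_ij."""
    pi = p.sum(axis=1, keepdims=True)
    pj = p.sum(axis=0, keepdims=True)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / (pi @ pj)[mask])))

# Hypothetical observed contingency table n_ij and a flat Dirichlet prior
counts = np.array([[30., 10.],
                   [ 5., 55.]])
prior = 1.0                                   # Dirichlet(1, ..., 1) over the joint chances

# Posterior over the joint chances pi_ij is Dirichlet(n_ij + prior);
# sampling from it gives a Monte Carlo picture of the posterior of I(pi).
posterior_params = (counts + prior).ravel()
samples = rng.dirichlet(posterior_params, size=20000)
mi_samples = np.array([mi_from_joint(s.reshape(counts.shape)) for s in samples])

print("posterior mean", mi_samples.mean())
print("posterior std ", mi_samples.std())
print("95% credible interval", np.quantile(mi_samples, [0.025, 0.975]))
```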
Mutual Information and Eigenvalue Distribution of MIMO Ricean Channels
This paper presents an explicit expression for the marginal probability density of the unordered eigenvalues of a noncentral Wishart matrix HH†, where H can represent a multiple-input multiple-output channel obeying the Ricean law. By integrating over this marginal density, the corresponding ergodic mutual information is also characterized in explicit form.
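The cited result is a closed-form expression obtained by integrating over the eigenvalue density. As an independent sanity check only, the Monte Carlo sketch below estimates the same ergodic mutual information E[log det(I + (SNR/nt) HH†)] under assumed parameters (antenna counts, Rice K-factor, and an all-ones line-of-sight matrix are all assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

def ergodic_mi_ricean(nt=2, nr=2, snr_db=10.0, k_factor=3.0, trials=20000):
    """Monte Carlo estimate of the ergodic mutual information (nats/channel use)
    E[log det(I + (SNR/nt) H H^dagger)] for an i.i.d. Ricean MIMO channel."""
    snr = 10.0 ** (snr_db / 10.0)
    h_los = np.ones((nr, nt), dtype=complex)   # assumed deterministic line-of-sight component
    mi = 0.0
    for _ in range(trials):
        h_nlos = (rng.standard_normal((nr, nt)) + 1j * rng.standard_normal((nr, nt))) / np.sqrt(2)
        h = np.sqrt(k_factor / (k_factor + 1)) * h_los + np.sqrt(1 / (k_factor + 1)) * h_nlos
        gram = h @ h.conj().T                  # HH^dagger is noncentral-Wishart-distributed
        mi += np.log(np.linalg.det(np.eye(nr) + (snr / nt) * gram)).real
    return mi / trials

print(ergodic_mi_ricean())
```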
Journal
Journal title: Physics Letters A
Year: 2001
ISSN: 0375-9601
DOI: 10.1016/s0375-9601(01)00128-1