Information Theory: 4.1 Entropy and Mutual Information
Abstract
Neural encoding and decoding focus on the question: "What does the response of a neuron tell us about a stimulus?" In this chapter we consider a related but different question: "How much does the neural response tell us about a stimulus?" The techniques of information theory allow us to answer this question quantitatively. Furthermore, we can use them to ask what forms of neural response are optimal for conveying information about natural stimuli. Information theory is a general framework for quantifying the ability of a coding scheme to convey information. It is assumed that the code involves a number of symbols, and the quantities we consider in this chapter, the entropy and the mutual information, depend on the frequencies with which these symbols, or combinations of them, are used. Entropy is a measure of the theoretical capacity of a code to convey information. Mutual information measures how much of that capacity is actually used when the code is applied to describe a particular set of data. In neuroscience applications, the symbols we consider are neuronal responses, and the data sets they describe are stimulus characteristics. In the most complete analyses, considered at the end of the chapter, the neuronal response is characterized by a list of action potential firing times; the symbols being analyzed are then sequences of action potentials. Computing the entropy and mutual information for spike sequences can be difficult because the frequency of occurrence of many different spike sequences must be determined, which typically requires a large amount of data. For this reason, many information-theoretic analyses use simplified descriptions of the response of a neuron that reduce the number of possible 'symbols' (i.e., responses) that need to be considered. We discuss cases in which the symbols consist of responses with different numbers of action potentials.
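As a concrete illustration of these two quantities (not part of the original chapter), the Python sketch below computes the response entropy H(R) and the stimulus-response mutual information I(S;R) from a small, hypothetical table of joint stimulus-response probabilities; the numbers are assumptions chosen only for illustration.

```python
import numpy as np

# Hypothetical joint probability table P(s, r): rows index stimuli,
# columns index discrete response symbols (e.g. spike-count bins).
# The values are illustrative assumptions, not data.
p_sr = np.array([[0.30, 0.10],
                 [0.05, 0.55]])

p_s = p_sr.sum(axis=1)   # marginal P(s)
p_r = p_sr.sum(axis=0)   # marginal P(r)

# Response entropy H(R) = -sum_r P(r) log2 P(r):
# the theoretical capacity of the response code, in bits.
H_r = -np.sum(p_r * np.log2(p_r))

# Mutual information I(S;R) = sum_{s,r} P(s,r) log2[P(s,r) / (P(s)P(r))]:
# the part of that capacity actually used to encode the stimulus.
I_sr = np.sum(p_sr * np.log2(p_sr / np.outer(p_s, p_r)))

print(f"H(R)   = {H_r:.3f} bits")
print(f"I(S;R) = {I_sr:.3f} bits")
```

For the numbers above, I(S;R) comes out strictly smaller than H(R), illustrating the point in the abstract: entropy bounds what the code could convey, while mutual information measures what it actually conveys about the stimulus.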
Similar resources
Clustering of a Number of Genes Affecting in Milk Production using Information Theory and Mutual Information
Information theory is a branch of mathematics used in genetic and bioinformatics analyses, and it can be applied to many analyses of biological structures and sequences. Bio-computational grouping of genes facilitates genetic analysis, sequencing, and structure-based analyses. In this study, after retrieving gene and exon DNA sequences affecting milk yield in dairy ...
Mutual information is copula entropy
In information theory, mutual information (MI) is usually presented as a concept distinct from entropy [1]. In this paper, we prove using copulas [2] that they are essentially the same: mutual information is also a kind of entropy, called copula entropy. Based on this result, we propose a simple method for estimating mutual information. Copula theory concerns dependence and the measurement of association [2]. Skla...
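A minimal sketch of the estimation route this abstract points to (our reconstruction, not the authors' code): rank-transform each margin to obtain the empirical copula, estimate the copula's entropy with a k-nearest-neighbour (Kozachenko-Leonenko) estimator, and negate. The function name, the default k, and the choice of entropy estimator are all assumptions for illustration.

```python
import numpy as np
from scipy.stats import rankdata
from scipy.spatial import cKDTree
from scipy.special import digamma

def copula_entropy_mi(x, y, k=3):
    """Estimate I(X;Y) as the negative entropy of the empirical copula.

    (1) map each margin to (0,1) by its rank (empirical copula);
    (2) estimate the entropy of the copula sample with the
        Kozachenko-Leonenko k-nearest-neighbour estimator;
    (3) negate, since MI = -H(copula). Result is in nats.
    """
    n = len(x)
    u = np.column_stack([rankdata(x) / n, rankdata(y) / n])
    # distance from each copula point to its k-th nearest neighbour
    eps = cKDTree(u).query(u, k + 1)[0][:, -1]
    d = 2                              # copula dimension
    log_c = np.log(np.pi)              # log volume of the unit 2-ball
    h = -digamma(k) + digamma(n) + log_c + d * np.mean(np.log(eps))
    return -h

rng = np.random.default_rng(0)
x = rng.normal(size=2000)
y = x + rng.normal(size=2000)          # correlated pair, rho = 1/sqrt(2)
print(copula_entropy_mi(x, y))         # roughly 0.35 nats for this rho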
Quantum Information Chapter 10. Quantum Shannon Theory
Contents of Chapter 10, Quantum Shannon Theory:
10.1 Shannon for Dummies
10.1.1 Shannon entropy and data compression
10.1.2 Joint typicality, conditional entropy, and mutual information
10.1.3 Distributed source coding
10.1.4 The noisy channel coding theorem
10.2 Von Neumann Entropy
10.2.1 Mathematical properties of H(ρ)
10.2.2 Mixing, measurement, and entropy
10.2.3 Strong subadditivit...
On Classification of Bivariate Distributions Based on Mutual Information
Among all measures of independence between random variables, mutual information is the only one based on information theory. Mutual information takes into account all kinds of dependencies between variables, i.e., both linear and non-linear dependencies. In this paper we have classified some well-known bivariate distributions into two classes of distributions based on their mutua...
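As a worked instance of how mutual information quantifies dependence (our example, not drawn from the paper): for a bivariate normal pair with correlation coefficient $\rho$, the mutual information has the closed form

$$ I(X;Y) = -\tfrac{1}{2}\,\log\!\left(1 - \rho^{2}\right), $$

which vanishes exactly when $\rho = 0$ and diverges as $|\rho| \to 1$. For non-Gaussian families, mutual information can be positive even at zero correlation, which is precisely the kind of non-linear dependence a classification based on MI can distinguish.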
Information Theory and Decision Tree
Contents:
3 Basics of information theory
3.1 Entropy and uncertainty
3.2 Joint and conditional entropy
3.3 Mutual information and relative entropy
3.4 Some inequalities
3.5 Entropy of discrete distributions ...