Information theoretic inequalities

Authors

  • Amir Dembo
  • Thomas M. Cover
  • Joy A. Thomas
Abstract

The role of inequalities in information theory is reviewed and the relationship of these inequalities to inequalities in other branches of mathematics is developed.

Index Terms: Information inequalities, entropy power, Fisher information, uncertainty principles.

I. PREFACE: INEQUALITIES IN INFORMATION THEORY

Inequalities in information theory have been driven by a desire to solve communication theoretic problems. To solve such problems, especially to prove converses for channel capacity theorems, the algebra of information was developed and chain rules for entropy and mutual information were derived. Fano's inequality, for example, bounds the probability of error by the conditional entropy. Some deeper inequalities were developed as early as Shannon's 1948 paper. For example, Shannon stated the entropy power inequality in order to bound the capacity of non-Gaussian additive noise channels.

Information theory is no longer restricted to the domain of communication theory. For this reason it is interesting to consider the set of known inequalities in information theory and search for other inequalities of the same type. Thus motivated, we will look for natural families of information theoretic inequalities. For example, the entropy power inequality, which says that the entropy of the sum of two independent random vectors is no less than the entropy of the sum of their independent normal counterparts, has a strong formal resemblance to the Brunn-Minkowski inequality, which says that the volume of the set sum of two sets is greater than or equal to the volume of the set sum of their spherical counterparts. Similarly, since the exponentiated entropy is a measure of volume, it makes sense to consider the surface area of the volume of the typical set associated with a given probability density. Happily, this turns out to be another information quantity, the Fisher information.

A large number of inequalities can be derived from a strengthened Young's inequality. These inequalities include the entropy power inequality, the Brunn-Minkowski inequality and the Heisenberg uncertainty inequality. These inequalities are extreme points of the set of inequalities derivable from a central idea. Logically independent derivations of these inequalities exist and are based on Fisher information inequalities such as the Cramér-Rao inequality.

Turning our attention to simple inequalities for differential entropy, we apply them to the standard multivariate normal to furnish new and simpler proofs of the major determinant inequalities in classical mathematics. In particular, Hadamard's inequality, Ky Fan's inequality and others can be derived this way.
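For reference, the inequalities named above can be written out explicitly. The following are their standard forms (with differential entropy h in nats and |A| denoting the volume of a set A), not verbatim statements from the paper.

Fano's inequality: if \hat{X} is an estimate of a discrete random variable X taking values in a finite alphabet \mathcal{X}, and P_e = \Pr\{\hat{X} \neq X\}, then

    H(X \mid \hat{X}) \le H(P_e) + P_e \log(|\mathcal{X}| - 1).

Entropy power inequality: for independent random vectors X and Y in R^n with densities,

    e^{2h(X+Y)/n} \ge e^{2h(X)/n} + e^{2h(Y)/n},

with equality when X and Y are normal with proportional covariance matrices.

Brunn-Minkowski inequality: for nonempty compact sets A, B \subset R^n, with set sum A + B = \{a + b : a \in A,\ b \in B\},

    |A + B|^{1/n} \ge |A|^{1/n} + |B|^{1/n}.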
Indeed, we find some new matrix inequalities by this method. Moreover, the entropy power inequality, when specialized to matrices, turns out to yield Minkowski's determinant inequality, yet another tangency with the Minkowski of Brunn-Minkowski.

In the process of finding determinant inequalities we derive some new differential entropy inequalities. We restate one of them as follows. Suppose one is looking at ocean waves at a certain subset of points. Then the average entropy per sample of a random subset of samples can be shown to increase as the number of sampling points increases. On the other hand, the per sample conditional entropy of the samples, conditioned on the values of the remaining samples, monotonically decreases. Once again, using these entropy inequalities on the standard multivariate normal leads to associated matrix inequalities and in particular to an extension of the sequence of inequalities found by Hadamard and Szasz.

By turning our attention from the historically necessary inequalities to the natural set of inequalities suggested by information theory itself, we find, full circle, that these inequalities turn out to be useful as well. They improve determinant inequalities, lead to overlooked inequalities for the entropy rate of random subsets, and demonstrate the unity between physics, mathematics, information theory and statistics (through unified proofs of the Heisenberg, entropy power, Fisher information and Brunn-Minkowski inequalities).

The next section is devoted to differential entropy inequalities for random subsets of samples. These inequalities, when specialized to multivariate normal variables, provide the determinant inequalities presented in Section V. Section III focuses on the entropy power inequality (including the related Brunn-Minkowski, Young's and Fisher information inequalities), while Section IV deals with various uncertainty principles and their interrelations.

Lemma 2: If (X, Y) have a joint density, then h(X \mid Y) = h(X, Y) - h(Y).
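To illustrate the normal specialization described in the preface, here is a standard computation consistent with the claims above (a sketch, not a verbatim excerpt from the paper). If X = (X_1, ..., X_n) is multivariate normal with covariance matrix K (determinant |K|), then

    h(X) = \tfrac{1}{2} \log\bigl( (2\pi e)^n |K| \bigr).

Substituting this into the subadditivity bound h(X_1, ..., X_n) \le \sum_{i=1}^n h(X_i) yields Hadamard's inequality

    |K| \le \prod_{i=1}^n K_{ii},

and applying the entropy power inequality to independent normal vectors with covariance matrices A and B (so that their sum has covariance A + B) yields Minkowski's determinant inequality

    |A + B|^{1/n} \ge |A|^{1/n} + |B|^{1/n}

for positive definite A and B.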

Similar articles

Information-theoretic inequalities for contoured probability distributions

We show that for a special class of probability distributions that we call contoured distributions, information theoretic invariants and inequalities are equivalent to geometric invariants and inequalities of bodies in Euclidean space associated with the distributions. Using this, we obtain characterizations of contoured distributions with extremal Shannon and Renyi entropy. We also obtain a ne...

Recent Progresses in Characterising Information Inequalities

In this paper, we present a revision on some of the recent progresses made in characterising and understanding information inequalities, which are the fundamental physical laws in communications and compression. We will begin with the introduction of a geometric framework for information inequalities, followed by the first non-Shannon inequality proved by Zhang et al. in 1998 [1]. The discovery...

Extracting analytic proofs from numerically solved Shannon-type Inequalities

A class of information inequalities, called Shannon-type inequalities (STIs), can be proved via the computer software ITIP [1]. In previous work [2], we have shown how this technique can be utilized in a Fourier-Motzkin elimination algorithm for information theoretic inequalities. Here, we provide an algorithm for extracting analytic proofs of information inequalities. Shannon-type inequalit...

Generating Mathematical Inequalities via Fuzzy Information Measures

One of the important application areas of information theoretic measures is the development of some inequalities frequently used in information theory. The present communication deals with the development of such important inequalities through the maximization of entropy measures, especially while dealing with fuzzy distributions.

Concentration of Measure Inequalities in Information Theory, Communications, and Coding

During the last two decades, concentration inequalities have been the subject of exciting developments in various areas, including convex geometry, functional analysis, statistical physics, high-dimensional statistics, pure and applied probability theory (e.g., concentration of measure phenomena in random graphs, random matrices, and percolation), information theory, theoretical computer scienc...

Concentration of Measure Inequalities and Their Communication and Information-Theoretic Applications

During the last two decades, concentration of measure has been a subject of various exciting developments in convex geometry, functional analysis, statistical physics, high-dimensional statistics, probability theory, information theory, communications and coding theory, computer science, and learning theory. One common theme which emerges in these fields is probabilistic stability: complicated,...


Journal:
  • IEEE Trans. Information Theory

Volume 37, Issue 6

Pages: -

Publication date: 1991