Entropy of Weight Distributions of Small-Bias Spaces and Pseudobinomiality

Author

  • Louay Bazzi
Abstract

A classical bound in information theory asserts that a small L1-distance between probability distributions implies a small difference in Shannon entropy, but the converse need not be true. We show that if a probability distribution on {0, 1}^n has small bias, then the converse holds for its weight distribution in the proximity of the binomial distribution. Namely, we argue that if a probability distribution μ on {0, 1}^n is δ-biased, then ‖μ̄ − bin_n‖_1^2 ≤ (2 ln 2)(nδ + H(bin_n) − H(μ̄)), where μ̄ is the weight distribution of μ and bin_n is the binomial distribution on {0, . . . , n}. The key result behind this bound is a lemma which asserts the non-positivity of all the Fourier coefficients of the log-binomial function L : {0, 1}^n → R given by L(x) = lg bin_n(|x|). The original question which motivated the work reported in this paper is the problem of explicitly constructing a small subset of {0, 1}^n which is ε-pseudobinomial in the sense that the weight distribution of each of its restrictions and translations is ε-close to the binomial distribution. We study the notion of pseudobinomiality and we conclude that, for spaces with n^{−Θ(1)}-small bias, the pseudobinomiality error in the L1 sense is equivalent to that in the entropy-difference sense, in the n^{−Θ(1)}-error regime. We also study the notion of average-case pseudobinomiality, and we show that for spaces with n^{−Θ(1)}-small bias, the average entropy of the weight distribution of a random translation of the space is n^{−Θ(1)}-close to the entropy of the binomial distribution. We discuss resulting questions on the pseudobinomiality of sums of independent small-bias spaces. Using the above results, we show that the following conjectures are equivalent: (1) for all independent δ-biased random vectors X, Y ∈ {0, 1}^n, the F_2-sum X + Y is O((nδ)^{Ω(1)})-pseudobinomial; (2) for all independent δ-biased random vectors X, Y ∈ {0, 1}^n, the entropy of the weight of the sum satisfies H(|X + Y|) ≥ min{H(|X|), H(|Y|)} − O((nδ)^{Ω(1)}).

∗ To appear in Chicago Journal of Theoretical Computer Science, 2015.
† A conference version of this paper appeared in COCOON 2015, LNCS 9198 proceedings, pages 495-506.
‡ Department of Electrical and Computer Engineering, American University of Beirut, Beirut, Lebanon. E-mail: [email protected].
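The stated bound can be checked numerically on small examples. The following Python sketch is not taken from the paper: the perturbed-uniform example distribution and all helper names (entropy, bias, weight_distribution) are illustrative assumptions. It computes the bias δ, the weight distribution μ̄, and both sides of the inequality for n = 4.

```python
# Minimal numerical sanity check of the stated bound (not from the paper):
# for a distribution mu on {0,1}^n, compute its bias delta, its weight
# distribution mu_bar, the binomial distribution bin_n, and compare
#   ||mu_bar - bin_n||_1^2   vs   (2 ln 2) * (n*delta + H(bin_n) - H(mu_bar)).
from itertools import product
from math import comb, log, log2

def entropy(p):
    # Shannon entropy in bits of a probability vector.
    return -sum(q * log2(q) for q in p if q > 0)

def bias(mu, n):
    # delta = max over nonempty S of |E_mu[(-1)^{<S,x>}]|.
    best = 0.0
    for S in product((0, 1), repeat=n):
        if not any(S):
            continue
        c = sum(p * (-1) ** sum(s * x for s, x in zip(S, xs)) for xs, p in mu.items())
        best = max(best, abs(c))
    return best

def weight_distribution(mu, n):
    # mu_bar(k) = Pr_mu[|x| = k], for k = 0..n.
    w = [0.0] * (n + 1)
    for xs, p in mu.items():
        w[sum(xs)] += p
    return w

n = 4
# A small perturbation of the uniform (0-biased) distribution on {0,1}^n.
mu = {xs: 1 / 2 ** n for xs in product((0, 1), repeat=n)}
mu[(0,) * n] += 0.01
mu[(1,) * n] -= 0.01

delta = bias(mu, n)
mu_bar = weight_distribution(mu, n)
bin_n = [comb(n, k) / 2 ** n for k in range(n + 1)]

lhs = sum(abs(a - b) for a, b in zip(mu_bar, bin_n)) ** 2
rhs = (2 * log(2)) * (n * delta + entropy(bin_n) - entropy(mu_bar))
print(f"delta = {delta:.4f}  lhs = {lhs:.5f}  rhs = {rhs:.5f}  bound holds: {lhs <= rhs}")
```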


Similar articles

Entropy Numbers in Weighted Function Spaces and Eigenvalue Distributions of Some Degenerate Pseudodifferential Operators I

In this paper we study weighted function spaces of type B^s_{p,q}(R^n, ϱ(x)) and F^s_{p,q}(R^n, ϱ(x)), where ϱ(x) is a weight function of at most polynomial growth. Of special interest are the weight functions ϱ(x) = (1 + |x|^2)^{α/2} with α ∈ R. The main result deals with estimates for the entropy numbers of compact embeddings between spaces of this type.

Shannon entropy in generalized order statistics from Pareto-type distributions

In this paper, we derive exact analytical expressions for the Shannon entropy of generalized order statistics from Pareto-type and related distributions.

Determination of Maximum Bayesian Entropy Probability Distribution

In this paper, we consider methods for determining maximum entropy multivariate distributions with a given prior, under the constraints that the marginal distributions, or the marginals and the covariance matrix, are prescribed. Next, numerical solutions are considered for cases where no closed-form solution is available. Finally, these methods are illustrated with numerical examples.
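As an informal illustration of this kind of numerical determination (a sketch under assumed toy marginals, not the method of the cited paper), a discrete bivariate maximum-entropy pmf with prescribed marginals can be found by direct constrained optimization; with only marginal constraints the optimum is the product of the marginals, which provides an easy correctness check.

```python
# Hedged sketch, not the cited paper's method: numerically find the
# maximum-entropy joint pmf on a 3x2 grid whose marginals are prescribed.
import numpy as np
from scipy.optimize import minimize

px = np.array([0.2, 0.5, 0.3])   # assumed (toy) marginal of X1
py = np.array([0.4, 0.6])        # assumed (toy) marginal of X2

def neg_entropy(p):
    q = np.clip(p, 1e-12, None)  # avoid log(0)
    return float(np.sum(q * np.log(q)))

# Equality constraints: row sums match px, column sums match py.
constraints = (
    [{"type": "eq", "fun": lambda p, i=i: p.reshape(3, 2)[i, :].sum() - px[i]} for i in range(3)]
    + [{"type": "eq", "fun": lambda p, j=j: p.reshape(3, 2)[:, j].sum() - py[j]} for j in range(2)]
)

res = minimize(neg_entropy, x0=np.full(6, 1 / 6), bounds=[(0, 1)] * 6,
               constraints=constraints, method="SLSQP")
print(res.x.reshape(3, 2))       # approaches np.outer(px, py)
```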

A Note on the Bivariate Maximum Entropy Modeling

Let X = (X1, X2) be a continuous random vector. Under the assumption that the marginal distributions of X1 and X2 are given, we develop models for the vector X when there is partial information about the dependence structure between X1 and X2. The models obtained, based on the well-known principle of maximum entropy, are called maximum entropy (ME) mo...

On the Monotone Behavior of Time Dependent Entropy of Order alpha

In this paper we study some monotone behavior of the residual (past) entropy of order α. We prove that, under some relation between the hazard rates (reversed hazard rates) of two distribution functions F and G, when the residual (past) entropy of order α of F is decreasing (increasing), then the residual (past) entropy of G is decreasing (increasing). Using this, several conclusions regarding mo...


Publication date: 2014