An operational characterization of mutual information in algorithmic information theory
Authors
Abstract
We show that the mutual information, in the sense of Kolmogorov complexity, of any pair of strings x and y is equal, up to logarithmic precision, to the length of the longest shared secret key that two parties, one having x and the complexity profile of the pair and the other one having y and the complexity profile of the pair, can establish via a probabilistic protocol with interaction on a public channel. For l > 2, the longest shared secret that can be established from a tuple of strings (x_1, ..., x_l) by l parties, each one having one component of the tuple and the complexity profile of the tuple, is equal, up to logarithmic precision, to the complexity of the tuple minus the minimum communication necessary for distributing the tuple to all parties. We establish the communication complexity of secret key agreement protocols that produce a secret key of maximal length, for protocols with public randomness. We also show that if the communication complexity drops below the established threshold, then only very short secret keys can be obtained.
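A hedged restatement of these results in standard Kolmogorov-complexity notation may help orientation. The symbols below are assumptions introduced here for illustration, not notation taken from the paper: C(.) denotes Kolmogorov complexity, I(x:y) the algorithmic mutual information, S(...) the maximal length of a secret key the parties can agree on, and CO(...) the minimum communication needed to distribute the whole tuple to all parties; the O(log n) terms only stand in for the "logarithmic precision" mentioned above.

% Sketch only: notation assumed, error terms indicative.
\[
  I(x:y) \;=\; C(x) + C(y) - C(x,y),
  \qquad
  S(x,y) \;=\; I(x:y) \pm O(\log n),
\]
\[
  S(x_1,\dots,x_l) \;=\; C(x_1,\dots,x_l) - CO(x_1,\dots,x_l) \pm O(\log n)
  \quad\text{for } l > 2,
\]
where n may be read as the total complexity of the pair or tuple.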
Similar resources
Probabilistic Sufficiency and Algorithmic Sufficiency from the point of view of Information Theory
Given the importance of Markov chains in information theory, conditional probability for these random processes can also be defined in terms of mutual information. In this paper, the relationship between the concept of sufficiency and Markov chains from the perspective of information theory, and the relationship between probabilistic sufficiency and algorithmic sufficien...
An Operational Characterization of the Notion of Probability by Algorithmic Randomness
The notion of probability plays an important role in almost all areas of science. In modern mathematics, however, probability theory means nothing other than measure theory, and an operational characterization of the notion of probability has not yet been established. In this paper, based on the toolkit of algorithmic randomness, we present an operational characterization of the notion of probability ...
An Operational Characterization of the Notion of Probability by Algorithmic Randomness and Its Application to Cryptography
The notion of probability plays an important role in almost all areas of science, including cryptography. In modern mathematics, however, probability theory means nothing other than measure theory, and an operational characterization of the notion of probability has not yet been established. In this paper, based on the toolkit of algorithmic randomness, we present an operational characterization of t...
On Classification of Bivariate Distributions Based on Mutual Information
Among all measures of independence between random variables, mutual information is the only one based on information theory. Mutual information takes into account all kinds of dependencies between variables, i.e., both linear and non-linear dependencies. In this paper we have classified some well-known bivariate distributions into two classes of distributions based on their mutua...
Clustering of a Number of Genes Affecting in Milk Production using Information Theory and Mutual Information
Information theory is a branch of mathematics that is used in genetic and bioinformatics analyses and can be applied to many analyses related to biological structures and sequences. Bio-computational grouping of genes facilitates genetic analysis, sequencing, and structure-based analyses. In this study, after retrieving gene and exon DNA sequences affecting milk yield in dairy ...
Journal: Electronic Colloquium on Computational Complexity (ECCC)
Volume: 25
Publication year: 2018