Search results for: conditional entropy

Number of results: 122651

Journal: IEEE Trans. Information Theory, 1981
Jan M. Van Campenhout, Thomas M. Cover

It is well-known that maximum entropy distributions, subject to appropriate moment constraints, arise in physics and mathematics. In an attempt to find a physical reason for the appearance of maximum entropy distributions, the following theorem is offered. The conditional distribution of X_1, given the empirical observation (1/n) Σ_{i=1}^{n} h(X_i) = α, where X_1, X_2, ... are independent identically di...
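The conditional limit theorem this abstract sketches can be checked numerically. The sketch below is a simplified illustration, not the paper's construction: it conditions fair dice on their empirical mean and compares the resulting marginal of the first die with the exponentially tilted (maximum entropy) distribution for that mean; all function names are invented for this example.

```python
import math

def sum_counts(n, faces=6):
    """counts[s] = number of outcomes of n fair dice with total s."""
    counts = {0: 1}
    for _ in range(n):
        nxt = {}
        for s, c in counts.items():
            for f in range(1, faces + 1):
                nxt[s + f] = nxt.get(s + f, 0) + c
        counts = nxt
    return counts

def conditional_marginal(n, total, faces=6):
    """Exact P(X1 = x | X1 + ... + Xn = total) for fair dice."""
    rest = sum_counts(n - 1, faces)
    w = {x: rest.get(total - x, 0) for x in range(1, faces + 1)}
    z = sum(w.values())
    return {x: w[x] / z for x in w}

def max_entropy_tilt(alpha, faces=6):
    """Max-entropy law on {1..faces} with mean alpha: p(x) proportional to exp(lam*x)."""
    lo, hi = -20.0, 20.0
    for _ in range(200):                      # bisection on the mean
        lam = (lo + hi) / 2
        ws = [math.exp(lam * x) for x in range(1, faces + 1)]
        mean = sum(x * w for x, w in zip(range(1, faces + 1), ws)) / sum(ws)
        lo, hi = (lam, hi) if mean < alpha else (lo, lam)
    z = sum(ws)
    return {x: w / z for x, w in zip(range(1, faces + 1), ws)}

def kl(p, q):
    """Kullback-Leibler divergence of two distributions over the same support."""
    return sum(p[x] * math.log(p[x] / q[x]) for x in p if p[x] > 0)

tilt = max_entropy_tilt(4.5)
cond4 = conditional_marginal(4, 18)     # mean constraint 4.5 with n = 4
cond20 = conditional_marginal(20, 90)   # same constraint with n = 20
```

As n grows, the conditional marginal moves toward the tilted maximum entropy law, which is the physical reason the theorem offers for the ubiquity of such distributions.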

1995
Diego P. de Garrido, William A. Pearlman

The performance of optimum vector quantizers subject to a conditional entropy constraint is studied in this paper. This new class of vector quantizers was originally suggested by Chou and Lookabaugh. A locally optimal design of this kind of vector quantizer can be accomplished through a generalization of the well-known entropy-constrained vector quantizer (ECVQ) algorithm. This generalization ...
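The ECVQ algorithm the abstract builds on can be sketched as a Lloyd-style iteration in which each point is assigned to the codeword minimizing squared error plus a rate penalty lam * (-log2 p_j). This is a generic scalar illustration, not the conditional-entropy-constrained design of the paper, and all names are invented.

```python
import math, random

def ecvq_cost(data, codebook, p, lam):
    """Average Lagrangian cost: distortion + lam * codeword length (-log2 p_j)."""
    total = 0.0
    for x in data:
        total += min((x - c) ** 2 + (lam * -math.log2(pj) if pj > 0 else math.inf)
                     for c, pj in zip(codebook, p))
    return total / len(data)

def ecvq(data, codebook, lam, iters=30):
    """Lloyd-style ECVQ design: biased nearest-codeword assignment, then
    centroid and codeword-probability re-estimation. Each step cannot
    increase the Lagrangian cost."""
    k = len(codebook)
    p = [1.0 / k] * k
    for _ in range(iters):
        cells = [[] for _ in range(k)]
        for x in data:
            j = min(range(k), key=lambda j: (x - codebook[j]) ** 2
                    + (lam * -math.log2(p[j]) if p[j] > 0 else math.inf))
            cells[j].append(x)
        for j in range(k):
            if cells[j]:
                codebook[j] = sum(cells[j]) / len(cells[j])
            p[j] = len(cells[j]) / len(data)
    return codebook, p

random.seed(1)
data = ([random.gauss(-2, 0.4) for _ in range(300)]
        + [random.gauss(2, 0.4) for _ in range(300)])
init = [-3.0, -1.0, 1.0, 3.0]
before = ecvq_cost(data, init, [0.25] * 4, 0.1)
codebook, p = ecvq(data, init[:], 0.1)
after = ecvq_cost(data, codebook, p, 0.1)
```

The conditional-entropy-constrained variant of the paper replaces the unconditional codeword lengths -log2 p_j with lengths conditioned on context.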

Journal: Entropy, 2017
Bo Shi, Yudong Zhang, Chaochao Yuan, Shuihua Wang, Peng Li

Entropy measures have been extensively used to assess heart rate variability (HRV), a noninvasive marker of cardiovascular autonomic regulation. It is yet to be elucidated whether those entropy measures can sensitively respond to changes of autonomic balance and whether the responses, if there are any, are consistent across different entropy measures. Sixteen healthy subjects were enrolled in t...
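Sample entropy is one of the entropy measures commonly applied to HRV series in this literature; the truncated abstract does not say which measures this particular study used. One common formulation, as an illustration (variants differ in edge conventions):

```python
import math, random

def sample_entropy(x, m=2, r=None):
    """Sample entropy SampEn(m, r) = -ln(A/B): B counts template pairs of length m
    within Chebyshev tolerance r, A the same for length m+1 (self-matches are
    excluded via i < j). Default tolerance: 0.2 * standard deviation."""
    if r is None:
        mu = sum(x) / len(x)
        r = 0.2 * (sum((v - mu) ** 2 for v in x) / len(x)) ** 0.5
    def matches(mm):
        t = [x[i:i + mm] for i in range(len(x) - mm + 1)]
        return sum(1 for i in range(len(t)) for j in range(i + 1, len(t))
                   if max(abs(a - b) for a, b in zip(t[i], t[j])) <= r)
    b, a = matches(m), matches(m + 1)
    return -math.log(a / b) if a > 0 and b > 0 else math.inf

random.seed(2)
regular = [i % 2 for i in range(300)]            # perfectly alternating series
irregular = [random.random() for _ in range(300)]
se_reg = sample_entropy(regular)
se_irr = sample_entropy(irregular)
```

A regular series scores lower than an irregular one, which is the sensitivity to autonomic-balance changes such studies examine.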

2008
Sumiyoshi Abe

Based on the form invariance of the structures given by Khinchin’s axiomatic foundations of information theory and the pseudoadditivity of the Tsallis entropy indexed by q, the concept of conditional entropy is generalized to the case of nonadditive (nonextensive) composite systems. The proposed nonadditive conditional entropy is classically nonnegative but can be negative in the quantum conte...
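The pseudoadditivity the abstract mentions, and the conditional entropy it induces, can be verified numerically. This sketch assumes the usual form S_q(p) = (1 - Σ p_i^q)/(q - 1) and the chain rule S_q(A,B) = S_q(A) + S_q(B|A) + (1-q) S_q(A) S_q(B|A):

```python
def tsallis(p, q):
    """Tsallis entropy S_q(p) = (1 - sum_i p_i**q) / (q - 1), for q != 1."""
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

def conditional_tsallis(p_joint, p_a, q):
    """S_q(B|A) defined by solving the nonadditive chain rule
    S_q(A,B) = S_q(A) + S_q(B|A) + (1-q) S_q(A) S_q(B|A) for S_q(B|A)."""
    return ((tsallis(p_joint, q) - tsallis(p_a, q))
            / (1.0 + (1.0 - q) * tsallis(p_a, q)))

q = 2.0
p_a = [0.7, 0.3]
p_b = [0.5, 0.25, 0.25]
p_joint = [pa * pb for pa in p_a for pb in p_b]   # independent composite system
lhs = tsallis(p_joint, q)
rhs = (tsallis(p_a, q) + tsallis(p_b, q)
       + (1 - q) * tsallis(p_a, q) * tsallis(p_b, q))
```

For independent subsystems the chain rule reduces to pseudoadditivity, and the conditional entropy collapses to the marginal entropy of B, as the classical (nonnegative) case requires.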

2004
Yu Zheng, Gary Geunbae Lee, Byeongchang Kim

We model Mandarin phrase break prediction as a classification problem with three-level prosodic structures and apply conditional maximum entropy classification to this problem. We acquire multiple levels of linguistic knowledge from an annotated corpus to serve as well-integrated features for the maximum entropy framework. Five kinds of features were used to represent various linguistic constraints i...
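With binary indicator features, conditional maximum entropy classification of this kind reduces to multinomial logistic regression trained on the log-likelihood. A toy sketch with invented break/no-break features (the paper's actual feature set and labels are richer):

```python
import math

def train_maxent(examples, labels, classes, lr=0.5, epochs=300):
    """Tiny conditional maximum-entropy classifier: P(c|x) proportional to
    exp(sum of w[c][f] over active features f), trained by stochastic
    gradient ascent on the conditional log-likelihood."""
    feats = sorted({f for x in examples for f in x})
    w = {c: {f: 0.0 for f in feats} for c in classes}
    for _ in range(epochs):
        for x, y in zip(examples, labels):
            scores = {c: sum(w[c][f] for f in x) for c in classes}
            zmax = max(scores.values())
            z = sum(math.exp(s - zmax) for s in scores.values())
            for c in classes:
                prob = math.exp(scores[c] - zmax) / z
                g = (1.0 if c == y else 0.0) - prob   # observed - expected
                for f in x:
                    w[c][f] += lr * g
    return w

def predict(w, x, classes):
    return max(classes, key=lambda c: sum(w[c].get(f, 0.0) for f in x))

# hypothetical feature sets for words at candidate break positions
train_x = [{"pos=NN", "punct=yes"}, {"pos=VV", "punct=no"},
           {"pos=NN", "punct=no"}, {"pos=AD", "punct=yes"}]
train_y = ["B", "NB", "NB", "B"]        # break / no-break
model = train_maxent(train_x, train_y, ["B", "NB"])
```

The appeal of the framework, as the abstract notes, is that heterogeneous linguistic constraints all enter as features of one model.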

Journal: IACR Cryptology ePrint Archive, 2012
Benjamin Fuller, Leonid Reyzin

We investigate how information leakage reduces computational entropy of a random variable X. Recall that HILL and metric computational entropy are parameterized by quality (how distinguishable is X from a variable Z that has true entropy) and quantity (how much true entropy is there in Z). We prove an intuitively natural result: conditioning on an event of probability p reduces the quality of m...

2011
Benjamin Fuller, Leonid Reyzin

We investigate how information leakage reduces computational entropy of a random variable X. Recall that HILL and metric computational entropy are parameterized by quality (how distinguishable is X from a variable Z that has true entropy) and quantity (how much true entropy is there in Z). We prove an intuitively natural result: conditioning on an event of probability p reduces the quality of m...
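The paper concerns computational (HILL and metric) entropy, but the intuition has an easy-to-verify information-theoretic analogue: conditioning on an event of probability p costs at most log2(1/p) bits of min-entropy, since P[X = x | E] ≤ P[X = x]/p. A minimal sketch:

```python
import math

def min_entropy(dist):
    """Information-theoretic min-entropy: H_inf(X) = -log2 max_x P[X = x]."""
    return -math.log2(max(dist.values()))

dist = {x: 1 / 8 for x in range(8)}   # X uniform on 8 values: H_inf = 3 bits
event = {0, 1, 2}                     # leakage: the adversary learns X is here
p = sum(dist[x] for x in event)       # P[E] = 3/8
cond = {x: dist[x] / p for x in event}

loss = min_entropy(dist) - min_entropy(cond)
bound = math.log2(1 / p)              # conditioning costs at most log2(1/p) bits
```

The paper's contribution is proving the computational counterpart, where conditioning degrades the quality (distinguishability) parameter rather than only the quantity.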

2015
Alberto Porta, Luca Faes, Giandomenico Nollo, Vlasta Bari, Andrea Marchi, Beatrice De Maria, Anielle C. M. Takahashi, Aparecida M. Catai, Vladimir E. Bondarenko

Self-entropy (SE) and transfer entropy (TE) are widely utilized in biomedical signal processing to assess the information stored in a system and transferred from a source to a destination, respectively. The study proposes a more specific definition of the SE, namely the conditional SE (CSE), and a more flexible definition of the TE based on joint TE (JTE), namely the conditional JTE (CJTE), fo...
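The conditional measures the abstract proposes build on the basic transfer entropy. As an illustration only (the paper's CSE/CJTE condition on further signals), here is a plug-in estimate of the plain order-one discrete TE, TE = Σ p(y1, y0, x0) log2[ p(y1|y0, x0) / p(y1|y0) ]:

```python
import math, random
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in transfer entropy from X to Y with one-sample histories:
    the empirical conditional mutual information I(Y_next; X_now | Y_now)."""
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))
    pairs_yx = Counter(zip(y[:-1], x[:-1]))
    pairs_yy = Counter(zip(y[1:], y[:-1]))
    singles = Counter(y[:-1])
    n = len(y) - 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_full = c / pairs_yx[(y0, x0)]           # p(y1 | y0, x0)
        p_red = pairs_yy[(y1, y0)] / singles[y0]  # p(y1 | y0)
        te += (c / n) * math.log2(p_full / p_red)
    return te

random.seed(3)
x = [random.randint(0, 1) for _ in range(5000)]
y = [0] + x[:-1]              # Y copies X with a one-step delay: X drives Y
te_xy = transfer_entropy(x, y)   # close to 1 bit
te_yx = transfer_entropy(y, x)   # close to 0
```

The asymmetry between the two directions is what makes TE a directional coupling measure in biomedical applications.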

Journal: Int. J. General Systems, 2002
Jiye Liang, Kwai-Sang Chin, Chuangyin Dang, Richard C. M. Yam

Based on the complement behavior of information gain, a new definition of information entropy is proposed along with its justification in rough set theory. Some properties of this definition imply those of Shannon’s entropy. Based on the new information entropy, conditional entropy and mutual information are then introduced and applied to knowledge bases. The new information entropy is proved t...
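A sketch contrasting a complement-based entropy of a partition with Shannon's entropy of the same partition. The form E(A) = Σ_i (|X_i|/|U|)(1 - |X_i|/|U|) is an assumption made for this illustration; the exact definition is in the paper.

```python
import math

def complement_entropy(partition, n):
    """Assumed complement-based entropy of the partition {X_i} induced by an
    attribute set: sum of |X_i|/n * (1 - |X_i|/n) over the blocks."""
    return sum((len(b) / n) * (1 - len(b) / n) for b in partition)

def shannon_entropy(partition, n):
    """Classical Shannon entropy of the same partition, for comparison."""
    return -sum((len(b) / n) * math.log2(len(b) / n) for b in partition)

n = 4                                 # universe U = {0, 1, 2, 3}
coarse = [[0, 1, 2, 3]]               # one block: no discriminating power
medium = [[0, 1], [2, 3]]
fine = [[0], [1], [2], [3]]           # singletons: maximal discriminating power
```

Both measures vanish on the trivial partition and grow as the partition refines, which is the monotonicity that makes such entropies usable as uncertainty measures for knowledge bases in rough set theory.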

Journal: Physical Review Letters, 2015
Fernando G. S. L. Brandão, Aram W. Harrow, Jonathan Oppenheim, Sergii Strelchuk

We give two strengthenings of an inequality for the quantum conditional mutual information of a tripartite quantum state recently proved by Fawzi and Renner, connecting it with the ability to reconstruct the state from its bipartite reductions. Namely, we show that the conditional mutual information is an upper bound on the regularized relative entropy distance between the quantum state and its...

Chart of the number of search results per year

Click on the chart to filter the results by publication year