Search results for: entropy code

Number of results: 231646

Journal: IEEE Trans. Information Theory 1994
László Györfi, István Páli, Edward C. van der Meulen

We show that a discrete infinite distribution with finite entropy cannot be estimated consistently in information divergence. As a corollary we get that there is no universal source code for an infinite source alphabet over the class of all discrete memoryless sources with finite entropy.

2012
Marek Śmieja, Jacek Tabor

Rényi entropy dimension describes the rate of growth of coding cost in the process of lossy data compression in the case of exponential dependence between the code length and the cost of coding. In this paper we generalize the Csiszár estimation of the Rényi entropy dimension of the mixture of measures for the case of general probability metric space. This result determines the cost of encoding...
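The Rényi entropy underlying this notion of entropy dimension has a simple closed form; as a quick illustration of that base quantity (for a finite distribution, not the paper's generalized metric-space setting), a few lines of Python evaluate it:

```python
import math

def renyi_entropy(probs, alpha):
    """Rényi entropy of order alpha (alpha > 0, alpha != 1), in bits:
    H_alpha(p) = log2(sum(p_i ** alpha)) / (1 - alpha)."""
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return math.log2(sum(p ** alpha for p in probs)) / (1 - alpha)

# A fair coin has 1 bit of Rényi entropy at every order alpha.
print(renyi_entropy([0.5, 0.5], 2))  # → 1.0
```

As alpha → 1 the formula recovers the ordinary Shannon entropy; the exponential dependence between code length and coding cost mentioned in the abstract is what singles out a particular order alpha.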

Journal: Proceedings. IEEE Computer Society Bioinformatics Conference 2003
G. Sampath

A simple statistical block code in combination with the LZW-based compression utilities gzip and compress has been found to increase by a significant amount the level of compression possible for the proteins encoded in Haemophilus influenzae, the first fully sequenced genome. The method yields an entropy value of 3.665 bits per symbol (bps), which is 0.657 bps below the maximum of 4.322 bps and...
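For context, the 4.322 bps ceiling quoted above is simply the log of the alphabet size: log2(20) for the 20 standard amino acids. A minimal sketch of the empirical per-symbol entropy computation (the helper name is illustrative, not from the paper):

```python
import math
from collections import Counter

def entropy_bits_per_symbol(seq):
    """Empirical Shannon entropy of a symbol sequence, in bits per symbol."""
    counts = Counter(seq)
    n = len(seq)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

# Maximum entropy for a 20-letter amino-acid alphabet (uniform symbols):
print(round(math.log2(20), 3))  # → 4.322
```

The abstract's reported 3.665 bps is this empirical quantity measured on the actual protein sequences, which fall below the uniform-alphabet maximum because residue frequencies are skewed.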

Journal: J. Innovation in Digital Ecosystems 2016
Michael Wojnowicz, Glenn Chisholm, Matt Wolff, Xuan Zhao

Sophisticated malware authors can sneak hidden malicious code into portable executable files, and this code can be hard to detect, especially if encrypted or compressed. However, when an executable file switches between code regimes (e.g., native, encrypted, compressed, text, and padding), there are corresponding shifts in the file’s representation as an entropy signal. In this paper, we develo...
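The entropy-signal idea sketched in this abstract can be illustrated with a sliding-window Shannon entropy over a byte string; the function and the window/step parameters below are illustrative assumptions, not the authors' implementation:

```python
import math
from collections import Counter

def entropy_signal(data: bytes, window: int = 256, step: int = 128):
    """Windowed Shannon entropy (bits/byte) along a byte string.
    Low values suggest padding or repetitive code; values near 8
    suggest compressed or encrypted regions."""
    signal = []
    for start in range(0, max(len(data) - window + 1, 1), step):
        chunk = data[start:start + window]
        n = len(chunk)
        counts = Counter(chunk)
        signal.append(-sum(c / n * math.log2(c / n) for c in counts.values()))
    return signal

# All-zero padding followed by a uniform byte pattern: the signal
# steps from 0 bits/byte up toward the 8 bits/byte maximum.
sig = entropy_signal(b"\x00" * 512 + bytes(range(256)) * 2)
```

Shifts in such a signal are what mark the transitions between code regimes (native, encrypted, compressed, text, padding) that the paper exploits.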

2010
B. T. Maharaj

A new universal noise-robust lossless compression algorithm based on a decremental redundancy approach with Fountain codes is proposed. In this paper, the binary entropy code, augmented with a preprocessing system, is harnessed to compress complex sources. Both the compression performance over the whole binary entropy range and the noise robustness of an existing incremental redundancy Fountain code ...

Journal: Physical Review C, Nuclear Physics 1996
Ochs, Heinz

We investigate entropy production for an expanding system of particles and resonances with isospin symmetry (in our case pions and mesons) within the framework of relativistic kinetic theory. A cascade code to simulate the kinetic equations is developed, and results for entropy production and particle spectra are presented.

Journal: :Physical review. E 2016
Haiping Huang, Taro Toyoizumi

A network of neurons in the central nervous system collectively represents information by its spiking activity states. Typically observed states, i.e., code words, occupy only a limited portion of the state space due to constraints imposed by network interactions. Geometrical organization of code words in the state space, critical for neural information processing, is poorly understood due to i...

Journal: :CoRR 2014
Alessio Meneghetti, Massimiliano Sala, Alessandro Tomasi

We consider a bound on the bias reduction of a random number generator by processing based on binary linear codes. We introduce a new bound on the total variation distance of the processed output based on the weight distribution of the code generated by the chosen binary matrix. Starting from this result we show a lower bound for the entropy rate of the output of linear binary extractors.
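As a toy instance of linear-code post-processing (far simpler than the constructions this paper bounds), XOR-ing n biased bits is the syndrome map of the [n, n-1] single-parity-check code, and the piling-up lemma gives the output bias exactly as 2**(n-1) * delta**n:

```python
from itertools import product

def output_bias(n, delta):
    """Exact bias of the XOR (parity) of n i.i.d. bits with
    P(bit = 1) = 1/2 + delta, computed by brute-force enumeration."""
    p_odd = 0.0
    for bits in product([0, 1], repeat=n):
        prob = 1.0
        for b in bits:
            prob *= (0.5 + delta) if b else (0.5 - delta)
        if sum(bits) % 2 == 1:
            p_odd += prob
    return abs(p_odd - 0.5)

# Piling-up lemma check: 2**(3-1) * 0.1**3 = 0.004
print(round(output_bias(3, 0.1), 6))  # → 0.004
```

The bias shrinks geometrically in n, which is the elementary version of the bias-reduction effect that the paper's weight-distribution bound quantifies for general binary linear codes.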

Journal: :CoRR 2018
Igal Sason, Sergio Verdú

This paper provides upper and lower bounds on the optimal guessing moments of a random variable taking values on a finite set when side information may be available. These moments quantify the number of guesses required for correctly identifying the unknown object and, similarly to Arikan’s bounds, they are expressed in terms of the Arimoto-Rényi conditional entropy. Although Arikan’s bounds ar...
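The guessing moments discussed here have a direct combinatorial form: the optimal strategy guesses values in decreasing order of probability, so the k-th most likely value takes k guesses and the rho-th moment is a rank-weighted sum. A minimal sketch for the no-side-information case (the function name is illustrative):

```python
def guessing_moment(probs, rho):
    """Optimal rho-th guessing moment E[G(X)**rho]: guess candidates
    in decreasing order of probability, so the k-th most likely
    value is found on guess number k."""
    ranked = sorted(probs, reverse=True)
    return sum(p * (k ** rho) for k, p in enumerate(ranked, start=1))

# Uniform over 4 values: E[G] = (1 + 2 + 3 + 4) / 4 = 2.5
print(guessing_moment([0.25] * 4, 1))  # → 2.5
```

Bounds of the Arikan type relate moments of this form to the Arimoto-Rényi conditional entropy of an appropriate order.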
