Search results for: entropy code
Number of results: 231,646. Filter results by year:
This document specifies an Internet standards track protocol for the Internet community, and requests discussion and suggestions for improvements. Please refer to the current edition of the "Internet Official Protocol Standards" (STD 1) for the standardization state and status of this protocol. Distribution of this memo is unlimited. Abstract This document describes a Fully-Specified Forward Er...
A new measure called the average code word length of order is defined, and its relationship with Renyi's entropy of order is discussed. Using it, some coding theorems are proved under the condition ...
Robust, static disassembly is an important part of achieving high coverage for many binary code analyses, such as reverse engineering, malware analysis, reference monitor in-lining, and software fault isolation. However, one of the major difficulties current disassemblers face is differentiating code from data when they are interleaved. This paper presents a machine learning-based disassembly a...
A simple statistical block code in combination with the LZW-based compression utilities gzip and compress has been found to increase by a significant amount the level of compression possible for the proteins encoded in Haemophilus influenzae (hi), the first fully sequenced genome. The method yields an entropy of 3.665 bits per symbol (bps), which is 0.657 bps below the maximum of 4.322 bps. Thi...
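As a minimal sketch (not the paper's actual block-code method), the bits-per-symbol figures quoted above correspond to the zeroth-order empirical entropy, which can be estimated from symbol frequencies:

```python
import math
from collections import Counter

def entropy_bits_per_symbol(seq):
    """Empirical zeroth-order entropy of a symbol sequence, in bits per symbol."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A uniform 20-letter amino-acid alphabet attains the maximum
# log2(20) ~= 4.322 bits per symbol mentioned in the abstract.
print(entropy_bits_per_symbol("ACDEFGHIKLMNPQRSTVWY"))
```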
The highly peaked, wide-tailed pdfs that are encountered in many image coding algorithms are often modeled using the family of generalized Gaussian (GG) pdfs. We study entropy coding of quantized GG sources using prefix codes that are highly structured, and which therefore involve low computational complexity to utilize. We provide bounds for the redundancy associated with applying these codes t...
The main objective in sampling is to select a sample from a population in order to estimate some unknown population parameter, usually a total or a mean of some interesting variable. A simple way to take a sample of size n is to let all the possible samples have the same probability of being selected. This is called simple random sampling, and then all units have the same probability of being ch...
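A minimal sketch of the scheme described above, assuming the estimand is a population mean (function name and seed are illustrative):

```python
import random

def srs_mean_estimate(population, n, seed=0):
    """Simple random sampling without replacement: every subset of
    size n is equally likely, so every unit has the same inclusion
    probability n/N; the sample mean estimates the population mean."""
    rng = random.Random(seed)
    sample = rng.sample(population, n)
    return sum(sample) / n
```

Sampling the whole population (n = N) recovers the population mean exactly, which is a quick sanity check on the estimator.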
A general multi-terminal source code and a general multi-terminal channel code are presented. Constrained random-number generators with sparse matrices are used in the construction of both encoders and decoders. Achievable regions for source coding and channel coding are derived in terms of entropy functions, where the capacity region for channel coding provides an alternative to the region of ...
In this correspondence we provide new bounds on the expected length L of a binary one-to-one code for a discrete random variable X with entropy H. We prove that L ≥ H − log(H + 1) − H log(1 + 1/H). This bound improves on previous results. Furthermore, we provide upper bounds on the expected length of the best code as function of H and the most likely source letter probability.
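The lower bound stated in the abstract is easy to evaluate numerically; the sketch below assumes base-2 logarithms (the function name is illustrative):

```python
import math

def one_to_one_length_lower_bound(H):
    """Lower bound on the expected length of a binary one-to-one
    (non-prefix-free) code, as a function of the entropy H in bits:
        L >= H - log2(H + 1) - H * log2(1 + 1/H)
    Since H*log2(1 + 1/H) <= log2(e) ~= 1.443, the bound stays
    within log2(H + 1) + 1.443 bits of the entropy H."""
    return H - math.log2(H + 1) - H * math.log2(1 + 1 / H)
```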
Shannon (1948) has shown that a source (U, P, U) with output U satisfying Prob(U = u) = p_u can be encoded in a prefix code C = {c_u : u ∈ U} ⊂ {0, 1}* such that, for the entropy H(P) = Σ_{u∈U} −p_u log p_u, we have H(P) ≤ Σ_{u∈U} p_u ||c_u|| ≤ H(P) + 1, where ||c_u|| is the length of c_u. We use a prefix code C for another purpose, namely noiseless identification, that is, every user who wants to know whether a u (u ∈ U) of h...
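The Shannon bound above can be checked with the classical code lengths ⌈−log2 p_u⌉; a small sketch (this is Shannon coding generally, not the identification scheme of this particular paper):

```python
import math

def shannon_code_lengths(probs):
    """Shannon code lengths l_u = ceil(-log2 p_u). These satisfy
    Kraft's inequality, so a prefix code with these lengths exists,
    and the expected length obeys H(P) <= sum p_u*l_u <= H(P) + 1."""
    return [math.ceil(-math.log2(p)) for p in probs]

# Dyadic source: the expected length meets the entropy exactly.
probs = [0.5, 0.25, 0.125, 0.125]
lengths = shannon_code_lengths(probs)          # [1, 2, 3, 3]
H = -sum(p * math.log2(p) for p in probs)      # entropy in bits
avg = sum(p * l for p, l in zip(probs, lengths))
```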
This study presents the investigation of the local entropy generation in compressible flow through a suddenly expanding pipe. Air is used as the working fluid. The air enters the pipe with a turbulent profile following the 1/7th power law. The simulations are extended to include different expansion ratios, reduced gradually from 5 to 1. To determine the effects of the mass flux, φ′′, the ambient heat transf...
[Chart: number of search results per year]