Search results for: huffman code

Number of results: 168910

Journal: CoRR 2007
Michael B. Baer

New lower and upper bounds are obtained for the compression of optimal binary prefix codes according to various nonlinear codeword length objectives. Like the coding bounds for Huffman coding — which concern the traditional linear code objective of minimizing average codeword length — these are in terms of a form of entropy and the probability of the most probable input symbol. As in Huffman co...
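
As a point of reference for the linear objective these bounds generalize, here is a minimal Python sketch (the distribution is an arbitrary assumption) that builds a binary Huffman code and checks the classical entropy bounds H(p) ≤ L < H(p) + 1, printing the redundancy alongside the probability of the most probable symbol:

```python
import heapq
import math

def huffman_lengths(probs):
    """Codeword lengths of an optimal binary prefix (Huffman) code."""
    # Heap items: (probability, unique tiebreak, symbols in this subtree).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, t, s2 = heapq.heappop(heap)
        for s in s1 + s2:          # each merge adds one bit to every symbol below it
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, t, s1 + s2))
    return lengths

probs = [0.4, 0.2, 0.2, 0.1, 0.1]  # assumed example distribution
L = sum(p * l for p, l in zip(probs, huffman_lengths(probs)))
H = -sum(p * math.log2(p) for p in probs)
print(f"H = {H:.4f}, L = {L:.4f}, redundancy = {L - H:.4f}, p_max = {max(probs)}")
assert H <= L < H + 1              # the classical bounds for the linear objective
```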

Journal: IEEE Trans. Information Theory 1997
Tamás Linder, Vahid Tarokh, Kenneth Zeger

It is proven that for every random variable with a countably infinite set of outcomes and finite entropy there exists an optimal prefix code which can be constructed from Huffman codes for truncated versions of the random variable, and that the average lengths of any sequence of Huffman codes for the truncated versions converge to that of the optimal code. Also, it is shown that every optimal i...
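
A minimal sketch of the convergence this result describes, assuming a geometric source with entropy 2 bits and lumping the tail mass into a single extra symbol (one of several possible truncation conventions); the average length of the Huffman code for each truncated version approaches the optimal value of 2:

```python
import heapq

def avg_huffman_length(probs):
    """Average codeword length of a binary Huffman code for `probs`.

    Uses the fact that the average length equals the sum of the weights
    of all internal (merged) nodes of the Huffman tree."""
    heap = [(p, i) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    total = 0.0
    while len(heap) > 1:
        p1, _ = heapq.heappop(heap)
        p2, t = heapq.heappop(heap)
        total += p1 + p2
        heapq.heappush(heap, (p1 + p2, t))
    return total

# Geometric source P(X = k) = (1/2)^(k+1), k = 0, 1, 2, ...; its entropy is
# 2 bits and the optimal (unary) code also has average length 2.
for n in (4, 8, 16, 32):
    probs = [0.5 ** (k + 1) for k in range(n)]
    probs.append(1.0 - sum(probs))   # lump the tail mass into one extra symbol
    print(f"n = {n:2d}: average truncated-Huffman length = {avg_huffman_length(probs):.4f}")
```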

Journal: IEEE Trans. Information Theory 1997
Roberto De Prisco, Alfredo De Santis

1999
Ruy Luiz Milidiú, Eduardo Sany Laber, Artur Alves Pessoa

Given an alphabet C = (a1, . . . , an) and a corresponding list of weights [w1, . . . , wn], a Huffman code for this alphabet is a prefix code that minimizes the weighted length of a code string, defined to be ∑ wi li (summing over i = 1, . . . , n), where li is the length of the code assigned to ai. A Huffman code can be generated in O(n log n) time for an unsorted list of weights and in O(n) time if the weights are a...
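
The O(n) claim for presorted weights refers to the classic two-queue construction; here is a sketch, assuming the weights arrive in nondecreasing order (the example weights are made up):

```python
from collections import deque

def huffman_cost_sorted(weights):
    """Weighted length sum (sum of wi * li) of a Huffman code, in O(n) time,
    for weights given in nondecreasing order.

    Two-queue method: leaves wait in one queue; merged internal nodes are
    produced in nondecreasing weight order, so they form a second sorted
    queue and no heap is needed."""
    leaves = deque(weights)
    merged = deque()
    cost = 0

    def pop_min():
        if merged and (not leaves or merged[0] <= leaves[0]):
            return merged.popleft()
        return leaves.popleft()

    while len(leaves) + len(merged) > 1:
        a, b = pop_min(), pop_min()
        cost += a + b        # each merge charges one extra bit to every symbol below
        merged.append(a + b)
    return cost

print(huffman_cost_sorted([1, 1, 2, 3, 5, 8]))  # -> 45 for these made-up weights
```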

Abstract: In this paper, we fit a function to the probability density curve representing an information stream using an artificial neural network. The result of this methodology is a specific function that represents a memorizable probability density curve. We then use the resulting function for information compression via the Huffman algorithm. The difference between the proposed method and the general me...
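
Since the abstract gives no network architecture, the sketch below substitutes a low-degree polynomial fit for the neural-network step while keeping the overall pipeline: fit a compact function to the empirical density, then drive the Huffman code from the fitted curve rather than the raw counts. The stream, alphabet size, and polynomial degree are all assumptions:

```python
import heapq
import numpy as np

# Hypothetical stream: 10,000 draws over a 16-symbol alphabet.
rng = np.random.default_rng(0)
stream = rng.geometric(0.3, size=10_000) % 16
counts = np.bincount(stream, minlength=16)

# Stand-in for the neural-network fit (the paper's architecture is not given):
# fit a low-degree polynomial to the empirical density curve, then clip and
# renormalize so it is a valid distribution.
xs = np.arange(16)
density = counts / counts.sum()
fitted = np.clip(np.polyval(np.polyfit(xs, density, deg=4), xs), 1e-9, None)
fitted /= fitted.sum()

def huffman_lengths(probs):
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, t, s2 = heapq.heappop(heap)
        for s in s1 + s2:
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, t, s1 + s2))
    return lengths

# Build the Huffman code from the fitted (memorizable) curve, not the raw
# counts, then measure the cost of coding the actual stream with it.
lengths = huffman_lengths(list(fitted))
bits = sum(int(counts[s]) * lengths[s] for s in range(16))
print(f"{bits} bits total, {bits / len(stream):.3f} bits/symbol")
```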

Journal: International Journal of Computer Applications 2014

Journal: CoRR 2012
Utpal Nandi, J. K. Mandal

A lossless compression technique is proposed which uses a variable-length region formation technique to divide the input file into a number of variable-length regions. Huffman codes are obtained for the entire file after formation of the regions. Symbols of each region are compressed one by one. Comparisons are made among the proposed technique, the Region Based Huffman compression technique, and classical Huf...
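
A hedged sketch of the idea of region-wise coding, using a fixed split in place of the paper's (unspecified) variable-length region formation rule; the example input is contrived so the two halves have disjoint statistics, and per-region code tables are not counted:

```python
import heapq
from collections import Counter

def huffman_bits(data):
    """Bits needed to code `data` with a Huffman code built from its own counts."""
    counts = Counter(data)
    if len(counts) == 1:               # degenerate case: one distinct symbol
        return len(data)               # one bit per symbol, by convention
    heap = [(c, i, [s]) for i, (s, c) in enumerate(counts.items())]
    heapq.heapify(heap)
    depth = {s: 0 for s in counts}
    while len(heap) > 1:
        c1, _, s1 = heapq.heappop(heap)
        c2, t, s2 = heapq.heappop(heap)
        for s in s1 + s2:
            depth[s] += 1
        heapq.heappush(heap, (c1 + c2, t, s1 + s2))
    return sum(counts[s] * depth[s] for s in counts)

# A file whose statistics change halfway through favors per-region codes.
data = b"a" * 800 + b"b" * 200 + b"y" * 800 + b"z" * 200
regions = [data[:1000], data[1000:]]   # fixed split; the paper's regions are variable-length
print("classical Huffman   :", huffman_bits(data), "bits")
print("region-based Huffman:", sum(huffman_bits(r) for r in regions), "bits (tables not counted)")
```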

2007
Marek Biskup

In compressed data a single bit error propagates because of the corruption of the decoder’s state. This work is a study of error resilience in compressed data and, in particular, of the recovery of as much data as possible after a bit error. It is focused on Huffman codes. In a message encoded with a Huffman code a bit error causes the decoder to lose synchronization with the coder. The error p...
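
The loss and recovery of synchronization is easy to observe directly; a small sketch (the message and the flipped-bit position are arbitrary) that encodes a string, flips one bit, and decodes greedily:

```python
import heapq
from collections import Counter

def huffman_code(counts):
    """Symbol -> bitstring map for a binary Huffman code over `counts`."""
    heap = [(c, i, {s: ""}) for i, (s, c) in enumerate(counts.items())]
    heapq.heapify(heap)
    while len(heap) > 1:
        c1, _, m1 = heapq.heappop(heap)
        c2, t, m2 = heapq.heappop(heap)
        code = {s: "0" + b for s, b in m1.items()}
        code.update({s: "1" + b for s, b in m2.items()})
        heapq.heappush(heap, (c1 + c2, t, code))
    return heap[0][2]

def decode(bits, code):
    """Greedy prefix-code decoding; trailing unmatched bits are dropped."""
    inv = {b: s for s, b in code.items()}
    out, buf = [], ""
    for bit in bits:
        buf += bit
        if buf in inv:
            out.append(inv[buf])
            buf = ""
    return "".join(out)

text = "abracadabra_abracadabra"
code = huffman_code(Counter(text))
bits = "".join(code[c] for c in text)
i = 7                                           # arbitrary bit position to flip
corrupt = bits[:i] + ("1" if bits[i] == "0" else "0") + bits[i + 1:]
print("original :", decode(bits, code))
print("corrupted:", decode(corrupt, code))      # typically garbled near the error, then back in sync
```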

2005
Soheil Mohajer, Payam Pakzad, Ali Kakhbod

Consider a discrete finite source with N symbols, and with the probability distribution p := (u1, u2, . . . , uN). It is well known that the Huffman encoding algorithm [1] provides an optimal prefix code for this source. A D-ary Huffman code is usually represented using a D-ary tree T, whose leaves correspond to the source symbols; the D edges emanating from each intermediate node of T are lab...
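
A sketch of the D-ary construction, assuming the standard device of padding with zero-probability dummy symbols so the total number of nodes is congruent to 1 modulo D-1 and every merge combines exactly D nodes:

```python
import heapq

def dary_huffman_lengths(probs, D=3):
    """Codeword lengths of an optimal D-ary prefix (Huffman) code.

    Pads with zero-probability dummy symbols so the node count is
    congruent to 1 mod (D - 1); every merge then combines exactly D nodes."""
    n = len(probs)
    pad = (1 - n) % (D - 1)
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heap += [(0.0, n + j, []) for j in range(pad)]   # dummy leaves
    heapq.heapify(heap)
    lengths = [0] * n
    while len(heap) > 1:
        total, merged, tie = 0.0, [], None
        for _ in range(D):                 # merge the D least-probable nodes
            p, tie, s = heapq.heappop(heap)
            total += p
            merged += s
        for s in merged:                   # one more D-ary digit per symbol below
            lengths[s] += 1
        heapq.heappush(heap, (total, tie, merged))
    return lengths

probs = [0.3, 0.2, 0.15, 0.15, 0.1, 0.1]   # assumed example distribution
print(dary_huffman_lengths(probs, D=3))    # ternary codeword lengths
```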

Chart of the number of search results per year
