Fixed-rate universal lossy source coding and rates of convergence for memoryless sources
Authors
Abstract
A fixed-rate universal lossy coding scheme is introduced for independent and identically distributed (i.i.d.) sources. It is shown for finite-alphabet sources and arbitrary single-letter distortion measures that, as the sample size n grows, the expected distortion obtained using this universal scheme converges to Shannon's distortion-rate function D(R) at a rate O(log n / n). The scheme can be extended to the universal quantization of real i.i.d. sources subject to a squared-error criterion. In this case the per-letter distortion is shown to converge to D(R) at a rate O(√(log n / n)), both in expectation and almost surely, for any real-valued bounded i.i.d. source.
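For intuition about the limit the abstract refers to, the distortion-rate function D(R) has a simple closed form in a few textbook cases. The sketch below (an illustration, not part of the paper) numerically inverts the standard Bernoulli(p) rate-distortion function R(D) = h(p) - h(D) under Hamming distortion to obtain D(R), where h is the binary entropy function.

```python
import math

def h(x):
    """Binary entropy in bits."""
    return 0.0 if x in (0.0, 1.0) else -x * math.log2(x) - (1 - x) * math.log2(1 - x)

def distortion_rate_bernoulli(p, R):
    """D(R) for a Bernoulli(p) source (p <= 1/2) under Hamming distortion.

    Inverts R(D) = h(p) - h(D), valid for 0 <= D <= p, by bisection on D:
    h(p) - h(D) decreases from h(p) to 0 as D goes from 0 to p.
    """
    if R >= h(p):
        return 0.0  # enough rate to describe the source losslessly
    lo, hi = 0.0, p
    for _ in range(60):
        mid = (lo + hi) / 2
        if h(p) - h(mid) > R:
            lo = mid  # mid is below the target distortion
        else:
            hi = mid
    return (lo + hi) / 2

# At rate R = 0.5 bits for a fair coin, D(R) solves h(D) = 0.5, i.e. D ≈ 0.11.
print(round(distortion_rate_bernoulli(0.5, 0.5), 4))
```

The bisection converges geometrically, so 60 iterations pin D(R) well below floating-point noise.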
Similar articles
Rates of convergence in the source coding theorem, in empirical quantizer design, and in universal lossy source coding
Abstract: Rate-of-convergence results are established for vector quantization. Convergence rates are given for an increasing vector dimension and/or an increasing training set size. In particular, the following results are shown for memoryless real-valued sources with bounded support at transmission rate R: (1) If a vector quantizer with fixed dimension k is designed to minimize the empirical mea...
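The empirical quantizer design setting described above can be illustrated with a toy experiment (my sketch, not the papers' construction): design a k-level scalar quantizer from n training samples using Lloyd's algorithm, then measure its squared-error distortion on fresh data from the same bounded i.i.d. source.

```python
import random

def lloyd_scalar(train, k, iters=50):
    """Design a k-level scalar quantizer from training data via Lloyd's algorithm."""
    codebook = sorted(random.sample(train, k))  # initialize from distinct samples
    for _ in range(iters):
        # Nearest-neighbor partition of the training set.
        cells = [[] for _ in range(k)]
        for x in train:
            j = min(range(k), key=lambda i: (x - codebook[i]) ** 2)
            cells[j].append(x)
        # Centroid update; keep the old codepoint if a cell is empty.
        codebook = [sum(c) / len(c) if c else codebook[i] for i, c in enumerate(cells)]
    return codebook

def distortion(samples, codebook):
    """Average squared error when each sample maps to its nearest codepoint."""
    return sum(min((x - c) ** 2 for c in codebook) for x in samples) / len(samples)

random.seed(0)
source = lambda n: [random.uniform(0.0, 1.0) for _ in range(n)]  # bounded i.i.d. source
k = 4  # rate R = log2(4) = 2 bits per sample
test_set = source(20000)
for n in (50, 5000):
    cb = lloyd_scalar(source(n), k)
    print(n, round(distortion(test_set, cb), 5))
```

For a Uniform[0,1] source the optimal 4-level distortion is (1/4)²/12 ≈ 0.0052, and the test-set distortion of the empirically designed quantizer approaches this as the training size grows, which is the effect the convergence-rate results quantify.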
Critical behavior in lossy source coding
The following critical phenomenon was recently discovered. When a memoryless source is compressed using a variable-length fixed-distortion code, the fastest convergence rate of the (pointwise) compression ratio to R(D) is either O(1/√n) or O((log n)/n). We show it is always O(1/√n), except for discrete, uniformly distributed sources. Keywords: redundancy, rate-distortion theory, lossy data compression
Pointwise redundancy in lossy data compression and universal lossy data compression
We characterize the achievable pointwise redundancy rates for lossy data compression at a fixed distortion level. "Pointwise redundancy" refers to the difference between the description length achieved by an nth-order block code and the optimal nR(D) bits. For memoryless sources, we show that the best achievable redundancy rate is of order O(√n) in probability. This follows from a second-orde...
Second-Order Coding Rates for Conditional Rate-Distortion
This paper characterizes the second-order coding rates for lossy source coding with side information available at both the encoder and the decoder. We first provide non-asymptotic bounds for this problem and then specialize the non-asymptotic bounds for three different scenarios: discrete memoryless sources, Gaussian sources, and Markov sources. We obtain the second-order coding rates for these...
Journal: IEEE Trans. Information Theory
Volume: 41, Issue: -
Pages: -
Published: 1995