Quantized Minimum Error Entropy Criterion
Authors
Abstract
Compared with traditional learning criteria such as the mean square error (MSE), the minimum error entropy (MEE) criterion is superior in nonlinear and non-Gaussian signal processing and machine learning. The argument of the logarithm in Rényi's entropy estimator, called the information potential (IP), is a popular MEE cost in information theoretic learning (ITL). The computational complexity of the IP is, however, quadratic in the number of samples because of its double summation, which creates a computational bottleneck especially for large-scale datasets. To address this problem, we propose an efficient quantization approach that reduces the computational burden of the IP, decreasing the complexity from O(N²) to O(MN) with M ≪ N. The new learning criterion is called the quantized MEE (QMEE). Some basic properties of QMEE are presented, and illustrative examples are provided to verify its excellent performance.
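To make the complexity argument concrete, here is a minimal Python sketch, not the authors' implementation: `information_potential` evaluates the plain double-sum IP, while `quantized_information_potential` first maps the errors onto a small codebook using an assumed online quantization threshold `eps` (the names `sigma`, `eps`, and the codebook-building rule are illustrative assumptions, and may differ from the paper's quantization operator), so the inner sum runs over M ≪ N codewords.

```python
import numpy as np

def information_potential(errors, sigma=1.0):
    """Plain IP estimator: (1/N^2) * sum_i sum_j G_sigma(e_i - e_j).
    The double summation makes the cost O(N^2)."""
    e = np.asarray(errors, dtype=float)
    diff = e[:, None] - e[None, :]
    return float(np.mean(np.exp(-diff ** 2 / (2.0 * sigma ** 2))))

def quantized_information_potential(errors, sigma=1.0, eps=0.1):
    """Quantized IP sketch: an error joins the nearest codeword if it lies
    within eps, otherwise it opens a new codeword.  The inner sum then runs
    over the M resulting codewords only, giving O(MN) with M << N."""
    e = np.asarray(errors, dtype=float)
    codebook, counts = [], []
    for x in e:
        if codebook:
            idx = int(np.argmin(np.abs(np.asarray(codebook) - x)))
            if abs(codebook[idx] - x) <= eps:
                counts[idx] += 1
                continue
        codebook.append(x)
        counts.append(1)
    c = np.asarray(codebook)               # M codewords
    m = np.asarray(counts, dtype=float)    # samples absorbed by each codeword
    ker = np.exp(-(e[:, None] - c[None, :]) ** 2 / (2.0 * sigma ** 2))  # N x M
    return float(np.sum(ker @ m)) / len(e) ** 2

# Example with heavy-tailed (non-Gaussian) errors
errors = np.random.standard_t(df=3, size=2000)
print(information_potential(errors), quantized_information_potential(errors))
```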
Similar resources
Near-lossless image compression: minimum-entropy, constrained-error DPCM
A near-lossless image compression scheme is presented. It is essentially a differential pulse code modulation (DPCM) system with a mechanism incorporated to minimize the entropy of the quantized prediction error sequence. With a "near-lossless" criterion of no more than d gray levels of error for each pixel, where d is a small nonnegative integer, trellises describing all allowable quantized pred...
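As a side note, a per-pixel error bound of d is typically enforced with a uniform quantizer of step 2d+1; the sketch below shows only that basic bound-preserving quantizer, not the trellis-based entropy-minimizing mechanism described in the paper.

```python
import numpy as np

def quantize_prediction_error(e, d):
    """Uniform quantizer with step 2d+1: the reconstructed prediction error
    differs from the original by at most d gray levels."""
    step = 2 * d + 1
    q = int(np.floor((e + d) / step))   # quantizer index
    return q * step                     # reconstructed error, |e - recon| <= d

# Quick check of the +/- d guarantee (d = 2 here)
assert all(abs(e - quantize_prediction_error(e, 2)) <= 2 for e in range(-50, 51))
```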
On the Smoothed Minimum Error Entropy Criterion
Recent studies suggest that the minimum error entropy (MEE) criterion can outperform the traditional mean square error criterion in supervised machine learning, especially in nonlinear and non-Gaussian situations. In practice, however, one has to estimate the error entropy from the samples since in general the analytical evaluation of error entropy is not possible. By the Parzen windowing appro...
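For reference, the plug-in error-entropy estimator obtained by Parzen windowing, which the snippet above alludes to, can be written as a short sketch; the Gaussian kernel, the bandwidth name `sigma`, and the helper name `error_entropy` below are illustrative assumptions rather than the cited paper's code.

```python
import numpy as np

def error_entropy(errors, sigma=1.0):
    """Parzen-window (plug-in) estimate of the Shannon error entropy:
    H(e) ~ -(1/N) * sum_i log( (1/N) * sum_j G_sigma(e_i - e_j) )."""
    e = np.asarray(errors, dtype=float)
    diff = e[:, None] - e[None, :]
    norm = 1.0 / (np.sqrt(2.0 * np.pi) * sigma)
    density = norm * np.exp(-diff ** 2 / (2.0 * sigma ** 2))  # N x N kernel matrix
    return float(-np.mean(np.log(np.mean(density, axis=1))))
```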
An Extended Result on the Optimal Estimation Under the Minimum Error Entropy Criterion
The minimum error entropy (MEE) criterion has been successfully used in fields such as parameter estimation, system identification, and supervised machine learning. There is in general no explicit expression for the optimal MEE estimate unless some constraints on the conditional distribution are imposed. A recent paper has proved that if the conditional density is conditionally symmetric and...
A Quantized Kernel Learning Algorithm Using a Minimum Kernel Risk-Sensitive Loss Criterion and Bilateral Gradient Technique
Recently, inspired by correntropy, the kernel risk-sensitive loss (KRSL) has emerged as a novel nonlinear similarity measure defined in kernel space, which achieves better computational performance. After applying the KRSL to adaptive filtering, the corresponding minimum kernel risk-sensitive loss (MKRSL) algorithm has been developed. However, MKRSL as a traditional kernel adaptive filter...
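For context, the KRSL referred to above is, to the best of my reading of the KRSL literature, the expected exponential of one minus a Gaussian kernel of the error; the sample-average sketch below is an assumption, with hypothetical parameter names `sigma` and `lam`, not the cited algorithm itself.

```python
import numpy as np

def kernel_risk_sensitive_loss(errors, sigma=1.0, lam=1.0):
    """Assumed sample-average KRSL: (1/(lam*N)) * sum_i exp(lam * (1 - G_sigma(e_i))),
    where G_sigma is the Gaussian kernel.  Small errors are penalized gently,
    large errors saturate toward exp(lam)/lam, which bounds the loss."""
    e = np.asarray(errors, dtype=float)
    g = np.exp(-e ** 2 / (2.0 * sigma ** 2))   # Gaussian kernel of each error
    return float(np.mean(np.exp(lam * (1.0 - g))) / lam)
```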
Improved Minimum Entropy Filtering for Continuous Nonlinear Non-Gaussian Systems Using a Generalized Density Evolution Equation
This paper investigates the filtering problem for multivariate continuous nonlinear non-Gaussian systems based on an improved minimum error entropy (MEE) criterion. The system is described by a set of nonlinear continuous equations with non-Gaussian system noises and measurement noises. The recently developed generalized density evolution equation is utilized to formulate the joint probability ...
Journal: CoRR
Volume: abs/1710.04089
Pages: -
Publication year: 2017