Consistency Analysis of an Empirical Minimum Error Entropy Algorithm

Authors

  • Jun Fan
  • Ting Hu
  • Qiang Wu
  • Ding-Xuan Zhou
Abstract

In this paper we study the consistency of an empirical minimum error entropy (MEE) algorithm in a regression setting. We introduce two types of consistency. The error entropy consistency, which requires the error entropy of the learned function to approximate the minimum error entropy, is shown to always hold if the bandwidth parameter tends to 0 at an appropriate rate. The regression consistency, which requires the learned function to approximate the regression function, is, however, a more complicated issue. We prove that error entropy consistency implies regression consistency for homoskedastic models in which the noise is independent of the input variable. For heteroskedastic models, however, a counterexample shows that the two types of consistency do not coincide. A surprising result is that regression consistency always holds, provided the bandwidth parameter tends to infinity at an appropriate rate. Regression consistency for two classes of special models is shown to hold with a fixed bandwidth parameter, which further illustrates the complexity of the regression consistency of MEE. The Fourier transform plays a crucial role in our analysis.

∗ The work described in this paper is supported by the National Natural Science Foundation of China under Grants No. 11201079, 11201348, and 11101403, and by a grant from the Research Grants Council of Hong Kong [Project No. CityU 104012]. Jun Fan ([email protected]) is with the Department of Statistics, University of Wisconsin-Madison, 1300 University Avenue, Madison, WI 53706, USA. Ting Hu ([email protected]) is with the School of Mathematics and Statistics, Wuhan University, Wuhan 430072, China. Qiang Wu ([email protected]) is with the Department of Mathematical Sciences, Middle Tennessee State University, Murfreesboro, TN 37132, USA (corresponding author; Tel: +1 615 898 2053; Fax: +1 615 898 5422; Email: [email protected]). Ding-Xuan Zhou ([email protected]) is with the Department of Mathematics, City University of Hong Kong, Kowloon, Hong Kong, China.
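For readers unfamiliar with the criterion, the sketch below shows one common way to form the empirical error entropy that MEE-type algorithms minimize: the quadratic Rényi entropy of the residuals, estimated by Parzen windowing with a Gaussian window of bandwidth h. This is a minimal illustration under those assumptions, not the authors' implementation; the names `empirical_error_entropy` and `h` are ours.

```python
import numpy as np

def empirical_error_entropy(errors, h):
    """Parzen-window estimate of the quadratic Renyi entropy of the residuals.

    errors : 1-D array of residuals e_i = y_i - f(x_i)
    h      : bandwidth of the Gaussian Parzen window (illustrative name)
    """
    e = np.asarray(errors, dtype=float)
    n = e.size
    diff = e[:, None] - e[None, :]                       # all pairwise differences e_i - e_j
    window = np.exp(-diff**2 / (2.0 * h**2)) / (np.sqrt(2.0 * np.pi) * h)
    info_potential = window.sum() / (n * n)              # Parzen estimate of the information potential
    return -np.log(info_potential)                       # quadratic Renyi entropy estimate
```

Because the estimate depends only on pairwise differences of residuals, it is unchanged when a constant is added to the learned function; this shift-invariance is part of what makes the relationship between error entropy consistency and regression consistency delicate.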


Similar articles

Learning theory approach to minimum error entropy criterion

We consider the minimum error entropy (MEE) criterion and an empirical risk minimization learning algorithm when an approximation of Rényi’s entropy (of order 2) by Parzen windowing is minimized. This learning algorithm involves a Parzen windowing scaling parameter. We present a learning theory approach for this MEE algorithm in a regression setting when the scaling parameter is large. Consiste...
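Continuing the sketch above, the Parzen windowing scaling parameter mentioned in this abstract enters as a tuning knob of the window. Below is a minimal, self-contained sketch of minimizing such a criterion over a linear hypothesis class; `mee_linear_fit`, the Nelder-Mead optimizer, and the intercept adjustment are illustrative choices of ours, not the algorithm analyzed in the paper.

```python
import numpy as np
from scipy.optimize import minimize

def mee_linear_fit(X, y, h):
    """Fit a linear model by minimizing the Parzen-window entropy of its residuals.

    X : (n, d) design matrix, y : (n,) targets, h : Parzen scaling parameter.
    Illustrative sketch only.
    """
    def entropy_of_errors(e):
        diff = e[:, None] - e[None, :]
        window = np.exp(-diff**2 / (2.0 * h**2)) / (np.sqrt(2.0 * np.pi) * h)
        return -np.log(window.mean())                    # -log of the information potential

    w = minimize(lambda w: entropy_of_errors(y - X @ w),
                 np.zeros(X.shape[1]), method="Nelder-Mead").x
    b = np.mean(y - X @ w)    # entropy is shift-invariant, so set the intercept separately
    return w, b
```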


Optimization of Finned-Tube Heat Exchanger with Minimizing the Entropy Production rate

A compact finned-tube heat exchanger is used to transfer heat from the fluid flowing inside the tubes to the air outside. In this study, the entropy production and the optimal Reynolds number for finned-tube heat exchangers, based on minimum entropy production, have been investigated. As a result, the total entropy of compact heat exchangers, which is the sum of the production rate of fluid entropy ins...


Error-Based and Entropy-Based Discretization of Continuous Features

We present a comparison of error-based and entropy-based methods for discretization of continuous features. Our study includes both an extensive empirical comparison as well as an analysis of scenarios where error minimization may be an inappropriate discretization criterion. We present a discretization method based on the C4.5 decision tree algorithm and compare it to an existing entropy-based ...
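As a generic illustration of the entropy-based criterion discussed here (not the specific method compared in the paper), the sketch below scores candidate cut points on one continuous feature by the size-weighted class entropy of the two induced partitions; the names `best_entropy_split` and `entropy` are ours.

```python
import numpy as np

def entropy(labels):
    """Shannon entropy (bits) of a label vector."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def best_entropy_split(values, labels):
    """Entropy-based cut point for one continuous feature: pick the threshold
    minimizing the size-weighted entropy of the two sides."""
    order = np.argsort(values)
    v, y = np.asarray(values)[order], np.asarray(labels)[order]
    best_cut, best_score = None, np.inf
    for i in range(1, len(v)):
        if v[i] == v[i - 1]:
            continue                                     # cannot cut between equal values
        left, right = y[:i], y[i:]
        score = (len(left) * entropy(left) + len(right) * entropy(right)) / len(y)
        if score < best_score:
            best_cut, best_score = (v[i - 1] + v[i]) / 2.0, score
    return best_cut
```

An error-based criterion would instead score each candidate cut by the misclassification count of a majority-class rule on each side, which is the kind of criterion the paper argues can be inappropriate in some scenarios.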


Normalized Minimum Error Entropy Algorithm with Recursive Power Estimation

The minimum error entropy (MEE) algorithm is known to be superior in signal processing applications under impulsive noise. In this paper, based on an analysis of the behavior of the optimum weight and of its robustness against impulsive noise, a normalized version of the MEE algorithm is proposed. The step size of the MEE algorithm is normalized with the power of input entropy...


Theoretical links between universal and Bayesian compressed sensing algorithms

Quantized maximum a posteriori (Q-MAP) is a recently proposed Bayesian compressed sensing algorithm that, given the source distribution, recovers X from its linear measurements Y^m = AX, where A ∈ R^(m×n) denotes the known measurement matrix. On the other hand, Lagrangian minimum entropy pursuit (L-MEP) is a universal compressed sensing algorithm that aims at recovering X from its linear measurements...
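To make the measurement model concrete, here is a generic sparse-recovery sketch (ISTA applied to a lasso objective) for the setting y = Ax; it is a stand-in illustration only, implementing neither Q-MAP nor L-MEP, and the parameter names are ours.

```python
import numpy as np

def ista(A, y, lam=0.1, iters=200):
    """Recover a sparse x from y = A x by iterative soft-thresholding (ISTA)
    on the lasso objective 0.5*||Ax - y||^2 + lam*||x||_1.
    Generic stand-in, not Q-MAP or L-MEP."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2               # 1 / Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - y)                         # gradient of the quadratic term
        z = x - step * grad
        x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)   # soft threshold
    return x
```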



Journal:
  • CoRR

Volume: abs/1412.5272  Issue: –

Pages: –

Publication date: 2014