Analytical Bounds between Entropy and Error Probability in Binary Classifications

Authors

  • Bao-Gang Hu
  • Hong-Jie Xing
Abstract

The existing upper and lower bounds between entropy and error probability are mostly derived from inequalities on the entropy relations, which can introduce approximations into the analysis. We derive analytical bounds based on closed-form solutions of conditional entropy, without involving any approximation. Two basic types of classification errors are investigated in the context of binary classification problems, namely, Bayesian and non-Bayesian errors. We theoretically confirm that Fano’s lower bound is an exact lower bound for any type of classifier in a relation diagram of “error probability vs. conditional entropy”. The analytical upper bounds, achieved with respect to the minimum prior probability, are tighter than Kovalevskij’s upper bound.
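
To make the two classical bounds concrete, here is a minimal numerical sketch (not from the paper; the function names and the example value H = 0.7 bits are illustrative). It uses Fano's bound in its binary form H(Y|Ŷ) ≤ H_b(P_e), inverted numerically, and Kovalevskij's bound in its common two-class form P_e ≤ H/2 with entropy in bits:

    import numpy as np

    def binary_entropy(p):
        # H_b(p) = -p log2 p - (1 - p) log2 (1 - p), in bits
        p = np.clip(p, 1e-12, 1 - 1e-12)
        return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

    def fano_lower_bound(H):
        # invert H_b on [0, 1/2] by bisection: Fano gives P_e >= H_b^{-1}(H)
        lo, hi = 0.0, 0.5
        for _ in range(60):
            mid = 0.5 * (lo + hi)
            lo, hi = (mid, hi) if binary_entropy(mid) < H else (lo, mid)
        return 0.5 * (lo + hi)

    def kovalevskij_upper_bound(H):
        # two-class form of Kovalevskij's bound: P_e <= H / 2 (H in bits)
        return H / 2.0

    H = 0.7  # illustrative conditional entropy H(Y|Y_hat), in bits
    print(fano_lower_bound(H), kovalevskij_upper_bound(H))  # ~0.189, 0.35

For H = 0.7 bits this brackets the error probability between roughly 0.19 and 0.35; the paper's contribution is tightening the upper side of such an interval.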


Related articles

A New Approach of Deriving Bounds between Entropy and Error from Joint Distribution: Case Study for Binary Classifications

The existing upper and lower bounds between entropy and error are mostly derived through inequalities, without linking to joint distributions. In fact, from either a theoretical or an application viewpoint, there is a need for a complete set of interpretations of the bounds in relation to joint distributions. For this reason, in this work we propose a new approach of deriving the bo...
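
As an illustrative sketch of that linkage (the 2×2 joint matrix below is made up, not taken from the paper), both quantities in the bound diagram can be read off a single joint distribution p(y, ŷ):

    import numpy as np

    # joint distribution over (true label y, predicted label y_hat)
    P = np.array([[0.40, 0.10],    # y = 0
                  [0.05, 0.45]])   # y = 1

    error = P[0, 1] + P[1, 0]      # off-diagonal mass = misclassification prob.

    p_yhat = P.sum(axis=0)                  # marginal of the prediction
    cond = P / p_yhat                       # p(y | y_hat), column-wise
    H_cond = -(P * np.log2(cond)).sum()     # H(Y | Y_hat) in bits

    print(error, H_cond)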


An Optimization Approach of Deriving Bounds between Entropy and Error from Joint Distribution: Case Study for Binary Classifications

In this work, we propose a new approach of deriving the bounds between entropy and error from a joint distribution through optimization. The specific case study is given on binary classifications. Two basic types of classification errors are investigated, namely, the Bayesian and non-Bayesian errors. The consideration of non-Bayesian errors is due to the fact that most classifiers res...
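
A hedged sketch of the distinction (all numbers illustrative): the Bayesian error comes from the optimal rule argmax_y p(y|x), while a non-Bayesian error is that of an arbitrary fixed classifier:

    import numpy as np

    p_y = np.array([0.3, 0.7])                  # class priors
    p_x_given_y = np.array([[0.6, 0.3, 0.1],    # p(x | y = 0)
                            [0.1, 0.3, 0.6]])   # p(x | y = 1)
    p_xy = p_y[:, None] * p_x_given_y           # joint p(y, x)

    # Bayesian error: at each x, the non-maximal posterior mass is lost
    bayes_error = np.minimum(p_xy[0], p_xy[1]).sum()

    # non-Bayesian error: a fixed rule that always predicts class 1
    rule = np.array([1, 1, 1])                  # decision for x = 0, 1, 2
    non_bayes_error = sum(p_xy[1 - rule[x], x] for x in range(3))

    print(bayes_error, non_bayes_error)         # 0.19 vs. 0.30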


Taylor Expansion for the Entropy Rate of Hidden Markov Chains

We study the entropy rate of a hidden Markov process, defined by observing the output of a symmetric channel whose input is a first-order Markov process. Although this definition is very simple, computing the exact entropy rate remains an open problem. We introduce some probability matrices based on the Markov chain's and the channel's parameters. Then, we try to obtain an estimate ...
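
A minimal simulation sketch of one standard way to estimate such an entropy rate (the parameters q and eps are illustrative, not from the paper): by the AEP, −(1/n) log2 p(y_1, …, y_n), computed with the forward recursion, converges to the entropy rate:

    import numpy as np

    rng = np.random.default_rng(0)
    q, eps, n = 0.3, 0.1, 200_000   # chain flip prob., channel flip prob.

    T = np.array([[1 - q, q], [q, 1 - q]])          # hidden-state transitions
    E = np.array([[1 - eps, eps], [eps, 1 - eps]])  # p(y | x) for the channel

    # simulate the hidden binary Markov chain and its noisy observations
    flips = rng.random(n)
    x = np.empty(n, dtype=int)
    x[0] = rng.integers(2)
    for t in range(1, n):
        x[t] = x[t - 1] ^ (flips[t] < q)
    y = x ^ (rng.random(n) < eps)

    # forward recursion: accumulate -log2 p(y_t | y_1 .. y_{t-1})
    pi = np.array([0.5, 0.5])           # predictive distribution of x_t
    bits = 0.0
    for t in range(n):
        p_y = pi @ E[:, y[t]]           # p(y_t | past observations)
        bits -= np.log2(p_y)
        pi = ((pi * E[:, y[t]]) / p_y) @ T  # condition on y_t, step forward

    print(bits / n)                     # Monte Carlo entropy-rate estimate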


On Channel Capacity, Uncoded Error Probability, ML-Detection and Spin Glasses

For the binary symmetric channel it is well known that the capacity is C = 1 − h(p), with h(·) denoting the binary entropy function and p denoting the uncoded error probability. The goal of this paper is to extend such a relationship to channels where the connection between uncoded error probability and channel capacity is less obvious. A first step in that direction is the following lemma, proven in [1]: Lemma 1: Consider a memoryless weakly symmetric channel with binary input and outpu...
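
For reference, the textbook relationship the abstract starts from can be checked in a few lines (the crossover probability p = 0.11 is illustrative):

    import math

    def h(p):  # binary entropy in bits
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    p = 0.11          # illustrative uncoded error probability of a BSC
    print(1 - h(p))   # capacity ~ 0.5 bit per channel use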


Some Remarks on Classical and Classical-Quantum Sphere Packing Bounds: Rényi vs. Kullback-Leibler

We review the use of binary hypothesis testing for the derivation of the sphere packing bound in channel coding, pointing out a key difference between the classical and the classical-quantum setting. In the first case, two ways of using binary hypothesis testing are known, which lead to the same bound written in different analytical expressions. The first method historically compares output...
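
As a small aside on the two divergences in the title (the distributions below are illustrative): the Rényi divergence D_α(P‖Q) recovers the Kullback-Leibler divergence as α → 1:

    import numpy as np

    def renyi(p, q, alpha):
        # D_alpha(p || q) = 1/(alpha - 1) * log2 sum_i p_i^alpha q_i^(1-alpha)
        return np.log2(np.sum(p**alpha * q**(1 - alpha))) / (alpha - 1)

    def kl(p, q):
        # Kullback-Leibler divergence in bits
        return np.sum(p * np.log2(p / q))

    p = np.array([0.7, 0.3])
    q = np.array([0.4, 0.6])
    print(renyi(p, q, 0.999), kl(p, q))   # nearly equal, as expected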



Journal:
  • CoRR

Volume: abs/1205.6602

Published: 2012