Representation and generalization properties of class-entropy networks
Authors
Abstract
Using conditional class entropy (CCE) as a cost function allows feedforward networks to fully exploit classification-relevant information. CCE-based networks arrange the data space into partitions, which are assigned unambiguous symbols and are labeled by class information. Through this labeling mechanism the network models the empirical data distribution at the local level. Region labeling evolves with the network-training process, which follows a plastic algorithm. The paper proves several theoretical properties of CCE-based networks, covering both convergence during training and generalization ability at run time. In addition, analytical criteria and practical procedures are proposed to enhance the generalization performance of the trained networks. Experiments on artificial and real-world domains confirm the accuracy of this class of networks and demonstrate the validity of the described methods.
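The cost function named in the abstract, conditional class entropy, can be estimated empirically as H(C|R) = -Σ_r p(r) Σ_c p(c|r) log p(c|r), where R indexes the partitions (regions) of the data space and C the class labels. The sketch below is not the paper's algorithm, only a minimal illustration of the quantity itself: it assumes region assignments and class labels are given as parallel lists and estimates the entropy from their empirical counts.

```python
import math
from collections import Counter

def conditional_class_entropy(regions, classes):
    """Empirical conditional class entropy H(C | R), in bits.

    H(C|R) = -sum_{r,c} p(r, c) * log2 p(c | r),
    estimated from parallel lists of region and class assignments.
    """
    n = len(regions)
    region_counts = Counter(regions)            # n(r)
    joint_counts = Counter(zip(regions, classes))  # n(r, c)
    h = 0.0
    for (r, c), n_rc in joint_counts.items():
        p_rc = n_rc / n                         # joint p(r, c)
        p_c_given_r = n_rc / region_counts[r]   # conditional p(c | r)
        h -= p_rc * math.log2(p_c_given_r)
    return h

# Toy example: region 0 is pure (all class "a"), region 1 is an even
# mix of "a" and "b", so only region 1 contributes entropy.
regions = [0, 0, 0, 0, 1, 1, 1, 1]
classes = ["a", "a", "a", "a", "a", "b", "a", "b"]
print(conditional_class_entropy(regions, classes))  # 0.5
```

A network that minimizes this quantity is driven toward partitions whose regions are class-pure, which is the intuition behind using CCE as a training objective.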
Similar references
A Preferred Definition of Conditional Rényi Entropy
The Rényi entropy is a generalization of Shannon entropy to a one-parameter family of entropies. Tsallis entropy is likewise a generalization of Shannon entropy, with a non-logarithmic measure. After the introduction of Shannon entropy, the conditional Shannon entropy was derived and its properties became known. Also, for Tsallis entropy, the conditional entropy was introduced a...
Some properties of the parametric relative operator entropy
The notion of entropy was introduced by Clausius in 1850, and some of the main steps towards the consolidation of the concept were taken by Boltzmann and Gibbs. Since then several extensions and reformulations have been developed in various disciplines with motivations and applications in different subjects, such as statistical mechanics, information theory, and dynamical systems. Fujii and Kam...
Nonextensive triangle equality and other properties of Tsallis relative-entropy minimization
Kullback–Leibler relative-entropy has unique properties in cases involving distributions resulting from relative-entropy minimization. Tsallis relative-entropy is a one-parameter generalization of Kullback–Leibler relative-entropy in the nonextensive thermostatistics. In this paper, we present the properties of Tsallis relative-entropy minimization and present some differences with the classica...
Diagnosis of brain tumor using image processing and determination of its type with RVM neural networks
Typically, the diagnosis of a tumor is done through surgical sampling, which is more precise with existing methods. The difference is that this is an aggressive, time consuming and expensive way. In the statistical method, due to the complexity of the brain tissues and the similarity between the cancerous cells and the natural tissues, even a radiologist or an expert physician may also be in er...
A research on classification performance of fuzzy classifiers based on fuzzy set theory
Due to the complexity of objects and the vagueness of the human mind, fuzzy classification algorithms have attracted considerable attention from researchers. In this paper, we propose a concept of fuzzy relative entropy to measure the divergence between two fuzzy sets. Applying fuzzy relative entropy, we prove the conclusion that patterns with high fuzziness are close to the classi...
Journal: IEEE Transactions on Neural Networks
Volume 10, Issue 1
Pages: -
Publication date: 1999