Finite Mixture Model of Bounded Semi-Naive Bayesian Networks for Classification

Authors

  • Kaizhu Huang
  • Irwin King
  • Michael R. Lyu
Abstract

The Naive Bayesian (NB) network classifier, a probabilistic model with a strong assumption of conditional independence among features, shows surprisingly competitive prediction performance even when compared with some state-of-the-art classifiers. With a looser assumption of conditional independence, the Semi-Naive Bayesian (SNB) network classifier is superior to NB classifiers when features are combined. However, the structure of the SNB is still strongly constrained, which may generate inaccurate distributions for some datasets. A natural way to improve the SNB is to extend it using a mixture approach. However, traditional SNBs use local heuristic approaches to learn the structure from data, whereas the mixture approach obtains the structure iteratively via the Expectation-Maximization (EM) method; integrating the local heuristic into the maximization step is difficult because the resulting procedure may not converge. In this paper we first develop a Bounded Semi-Naive Bayesian network (B-SNB) model, which restricts the number of variables that can be joined into a combined feature. As opposed to the local nature of traditional SNB models, our model enjoys a global nature and maintains a polynomial time cost. Overcoming the difficulty of integrating SNBs into the mixture model, we then propose an algorithm to extend the B-SNB into a finite mixture structure, named the Mixture of Bounded Semi-Naive Bayesian networks (MBSNB). We give theoretical derivations, an outline of the algorithm, an analysis of the algorithm, and a set of experiments to demonstrate the usefulness of MBSNB in some classification tasks. The novel finite MBSNB network shows good speed-up, the ability to converge, and an increase in prediction accuracy.
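The abstract describes fitting a finite mixture of constrained Bayesian-network components with EM. As a simplified, hypothetical illustration (not the authors' B-SNB algorithm), the sketch below runs EM on a mixture of plain naive Bayes components over discrete data, i.e. the degenerate case where each "combined feature" is a single variable; the function name, initialization, and smoothing constant are our own choices.

```python
import numpy as np

def em_mixture_nb(X, n_components, n_values, n_iter=50, seed=0):
    """Fit a finite mixture of discrete naive Bayes components with EM.

    X            : (n, d) integer array, feature values in {0, ..., n_values-1}
    n_components : number of mixture components M
    Returns mixing weights pi (M,), per-component multinomials
    theta (M, d, n_values), and responsibilities r (n, M).
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    M = n_components
    pi = np.full(M, 1.0 / M)                                # uniform mixing weights
    theta = rng.dirichlet(np.ones(n_values), size=(M, d))   # random multinomial init

    comp = np.arange(M)[None, :, None]                      # index helpers for gathering
    feat = np.arange(d)[None, None, :]
    for _ in range(n_iter):
        # E-step: log r[i, m] ∝ log pi[m] + sum_j log theta[m, j, X[i, j]]
        log_lik = np.log(theta)[comp, feat, X[:, None, :]].sum(axis=2)  # (n, M)
        log_r = np.log(pi)[None, :] + log_lik
        log_r -= log_r.max(axis=1, keepdims=True)           # stabilise before exp
        r = np.exp(log_r)
        r /= r.sum(axis=1, keepdims=True)                   # responsibilities (n, M)

        # M-step: re-estimate weights and multinomials from soft counts
        pi = r.mean(axis=0)
        counts = np.zeros((M, d, n_values))
        for v in range(n_values):
            counts[:, :, v] = r.T @ (X == v)                # soft count of value v
        counts += 1e-3                                      # Laplace-style smoothing
        theta = counts / counts.sum(axis=2, keepdims=True)
    return pi, theta, r
```

In the paper's B-SNB setting, each component would instead factor over bounded groups of variables found by a global (rather than locally heuristic) search; only the E-step/M-step alternation shown here carries over unchanged.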

Similar papers

Finite Mixture Model of Bounded Semi-naive Bayesian Networks Classifier

The Semi-Naive Bayesian network (SNB) classifier, a probabilistic model with an assumption of conditional independence among the combined attributes, shows good performance in classification tasks. However, traditional SNBs can only combine two attributes into a combined attribute. This inflexibility, together with its strong independence assumption, may generate inaccurate distributions fo...

Full text

A Bayesian mixture model for classification of certain and uncertain data

There are different types of classification methods for classifying certain data. However, the values of the variables are not always certain; they may belong to an interval, in which case they are called uncertain data. In recent years, assuming that the uncertain data follow a normal distribution, several estimators of the mean and variance of this distribution have been proposed. In this paper, we co...

Full text

A Validation Test Naive Bayesian Classification Algorithm and Probit Regression as Prediction Models for Managerial Overconfidence in Iran's Capital Market

Corporate directors are influenced by overconfidence, one of the personality traits of individuals; it may lead to irrational decisions that have a significant impact on the company's performance in the long run. The purpose of this paper is to validate and compare the Naive Bayesian Classification algorithm and probit regression in the prediction of managerial overconfidence at pre...

Full text

Stability evaluation of Neural and statistical Classifiers based on Modified Semi-bounded Plug-in Algorithm

This paper illustrates a new criterion for evaluating the stability of neural networks compared to the Bayesian classifier. The stability comparison is performed by estimating the error-rate probability densities using the modified semi-bounded Plug-in algorithm. In this work, we attempt to demonstrate that the Bayesian approach for neural networks improves the performance and stability degree of the...

Full text

A Bayesian Network Classifier that Combines a Finite Mixture Model and a Naive Bayes Model

In this paper we present a new Bayesian network model for classification that combines the naive Bayes (NB) classifier and the finite mixture (FM) classifier. The resulting classifier aims at relaxing the strong assumptions on which the two component models are based, in an attempt to improve on their classification performance, both in terms of accuracy and in terms of calibration of the...

Full text


Journal:

Volume   Issue

Pages  -

Publication date: 2003