Search results for: naive bayesian classifier

Number of results: 145650

2004
Marcel van Gerven, Peter J. F. Lucas

The development of Bayesian classifiers is frequently accomplished by means of algorithms which are highly data-driven. Often, however, sufficient data are not available, which may be compensated for by eliciting background knowledge from experts. This paper explores the trade-offs between modelling using background knowledge from domain experts and machine learning using a small clinical datas...

2004
Harry Zhang, Jiang Su

It is well-known that naive Bayes performs surprisingly well in classification, but its probability estimation is poor. In many applications, however, a ranking based on class probabilities is desired. For example, a ranking of customers in terms of the likelihood that they buy one’s products is useful in direct marketing. What is the general performance of naive Bayes in ranking? In this paper...
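A minimal sketch of the ranking use case mentioned above (not the authors' experimental setup; the scikit-learn estimator and the synthetic "customer" data are assumptions): instances are ordered by the naive Bayes class probability rather than by the hard prediction.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))                    # hypothetical customer features
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)    # hypothetical "bought the product" label

model = GaussianNB().fit(X, y)
scores = model.predict_proba(X)[:, 1]            # P(buy | features); may be poorly calibrated
ranking = np.argsort(-scores)                    # customers from most to least likely buyers
print(ranking[:10])
```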

2007
Naren Ramakrishnan, Shubhangi Deshpande

In the last lecture we discussed the relationships between different modeling paradigms such as the Bayesian approach, Maximum A Posteriori (MAP) approach, Maximum Likelihood (ML) approach, and the Least-squares (LS) method. In this lecture we first prove the equivalence of LS and ML under the assumption of normally distributed error. Then, the notions of the naive Bayesian classifier and the L...
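As a brief sketch of the LS/ML equivalence referenced above (a standard result, not taken from the lecture notes; the model f, parameters theta, and noise variance sigma^2 are generic placeholders): with observations y_i = f(x_i; theta) + eps_i and i.i.d. Gaussian noise eps_i ~ N(0, sigma^2), the log-likelihood is

```latex
% Sketch: maximum likelihood reduces to least squares under i.i.d. Gaussian error
\ell(\theta) = -\frac{n}{2}\log(2\pi\sigma^2)
  - \frac{1}{2\sigma^2}\sum_{i=1}^{n}\bigl(y_i - f(x_i;\theta)\bigr)^2,
\qquad
\arg\max_{\theta}\,\ell(\theta) \;=\; \arg\min_{\theta}\,\sum_{i=1}^{n}\bigl(y_i - f(x_i;\theta)\bigr)^2 .
```

so the maximizer of the likelihood is exactly the least-squares estimate.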

1999
H. A. Güvenir, N. Emeksiz

This paper presents an expert system for differential diagnosis of erythemato-squamous diseases incorporating decisions made by three classification algorithms: nearest neighbor classifier, naive Bayesian classifier and voting feature intervals-5. This tool enables doctors to differentiate six types of erythemato-squamous diseases using clinical and histopathological parameters obtained from a ...
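A rough sketch of combining classifier decisions by voting, in the spirit of the system described above (assumptions: scikit-learn, toy data, and a nearest-neighbour plus Gaussian naive Bayes pair standing in for the three algorithms; VFI5 has no standard scikit-learn implementation):

```python
import numpy as np
from sklearn.ensemble import VotingClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(3)
X = rng.normal(size=(150, 5))                    # stand-in for clinical/histopathological features
y = (X[:, 0] - X[:, 3] > 0).astype(int)          # hypothetical diagnosis label

ensemble = VotingClassifier(
    estimators=[("nn", KNeighborsClassifier(n_neighbors=5)),
                ("nbc", GaussianNB())],
    voting="hard",                               # each member casts one vote per case
)
ensemble.fit(X, y)
print(ensemble.predict(X[:5]))
```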

2007
Narin Emeksiz, H. Altay Güvenir

This paper is about the implementation of a visual tool for Differential Diagnosis of Erythemato-Squamous Diseases based on the classification algorithms: Nearest Neighbor Classifier (NN), Naive Bayesian Classifier using Normal Distribution (NBC) and Voting Feature Intervals-5 (VFI5). This tool enables doctors to differentiate six types of Erythemato-Squamous Diseases using clinical and hist...

2003
C. Ratanamahatana, D. Gunopulos

It is known that the Naive Bayesian classifier (NB) works very well on some domains, and poorly on others. The performance of NB suffers in domains that involve correlated features. C4.5 decision trees, on the other hand, typically perform better than the Naive Bayesian algorithm on such domains. This paper describes a Selective Bayesian classifier (SBC) that simply uses only those features that C4.5...
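A minimal sketch of that feature-selection idea (not the authors' code; scikit-learn's DecisionTreeClassifier with the entropy criterion stands in for C4.5, and the data are synthetic): train a decision tree, keep only the attributes it actually splits on, and fit naive Bayes on that reduced feature set.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 8))
y = (X[:, 0] + X[:, 2] > 0).astype(int)          # only two features are informative

tree = DecisionTreeClassifier(criterion="entropy", max_depth=4).fit(X, y)
used = np.unique(tree.tree_.feature[tree.tree_.feature >= 0])   # features used in splits
selective_nb = GaussianNB().fit(X[:, used], y)   # naive Bayes restricted to those features
print("features kept:", used, "accuracy:", selective_nb.score(X[:, used], y))
```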

Journal: CoRR, 2004
Vikas Hamine, Paul Helman

Naive Bayes is a simple Bayesian classifier with strong independence assumptions among the attributes. This classifier, despite its strong independence assumptions, often performs well in practice. It is believed that relaxing the independence assumptions of a naive Bayes classifier may improve the classification accuracy of the resulting structure. While finding an optimal unconstrained Bayesi...
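To make the independence assumption mentioned above concrete, here is a small from-scratch sketch (toy weather data and Laplace smoothing constant are illustrative assumptions): the class posterior is taken proportional to the class prior times a product of per-attribute likelihoods.

```python
from collections import Counter, defaultdict

def train_nb(rows, labels, alpha=1.0):
    class_counts = Counter(labels)
    value_counts = defaultdict(Counter)          # (class, attribute index) -> value counts
    attr_values = defaultdict(set)               # attribute index -> distinct values seen
    for row, c in zip(rows, labels):
        for i, v in enumerate(row):
            value_counts[(c, i)][v] += 1
            attr_values[i].add(v)
    return class_counts, value_counts, attr_values, alpha

def predict_nb(model, row):
    class_counts, value_counts, attr_values, alpha = model
    total = sum(class_counts.values())
    best, best_score = None, float("-inf")
    for c, nc in class_counts.items():
        score = nc / total                       # class prior P(c)
        for i, v in enumerate(row):              # independence assumption: multiply P(x_i | c)
            counts = value_counts[(c, i)]
            score *= (counts[v] + alpha) / (nc + alpha * len(attr_values[i]))
        if score > best_score:
            best, best_score = c, score
    return best

rows = [("sunny", "hot"), ("rainy", "mild"), ("sunny", "mild"), ("rainy", "hot")]
labels = ["no", "yes", "yes", "no"]
print(predict_nb(train_nb(rows, labels), ("sunny", "mild")))
```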

2011
S. Krishnaveni

Data mining is the extraction of hidden patterns from large databases. It is commonly used in marketing, surveillance, fraud detection and scientific discovery. Within data mining, machine learning is the research area focused on automatically learning to recognize complex patterns and make intelligent decisions from data. Nowadays traffic accidents are among the major causes of death and injuries i...

1995
George H. John, Pat Langley

When modeling a probability distribution with a Bayesian network, we are faced with the problem of how to handle continuous variables. Most previous work has either solved the problem by discretizing, or assumed that the data are generated by a single Gaussian. In this paper we abandon the normality assumption and instead use statistical methods for nonparametric density estimation. For a n...
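A compact sketch of the nonparametric alternative discussed above (not the paper's implementation; Gaussian kernels via scipy's gaussian_kde and the toy two-class data are assumptions): each continuous attribute's class-conditional density is modelled with a kernel estimate instead of a single Gaussian.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(2)
X = np.concatenate([rng.normal(0, 1, (100, 2)),
                    rng.normal(3, 1, (100, 2))])
y = np.array([0] * 100 + [1] * 100)

# One kernel density estimate per (class, attribute) pair.
kdes = {(c, j): gaussian_kde(X[y == c, j]) for c in (0, 1) for j in range(X.shape[1])}
priors = {c: np.mean(y == c) for c in (0, 1)}

def predict(x):
    # Naive Bayes rule with kernel-estimated class-conditional densities.
    scores = {c: priors[c] * np.prod([kdes[(c, j)].evaluate(x[j])[0]
                                      for j in range(len(x))])
              for c in (0, 1)}
    return max(scores, key=scores.get)

print(predict(np.array([2.5, 2.8])))   # should favour class 1
```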

Journal: Entropy, 2015
Limin Wang, Haoyu Zhao, Minghui Sun, Yue Ning

The inference of a general Bayesian network has been shown to be an NP-hard problem, even for approximate solutions. Although the k-dependence Bayesian (KDB) classifier can be constructed at arbitrary points (values of k) along the attribute dependence spectrum, it cannot identify the changes in interdependencies when attributes take different values. Local KDB, which learns in the framework of KDB, is ...
