Search results for: ensemble of decision tree

Number of results: 21209322

Journal: Hospital Practice and Research
Samaneh Aghajani, Department of Industrial Engineering, Tarbiat Modares University, Tehran, Iran; Mehrdad Kargari, Department of Industrial Engineering, Tarbiat Modares University, Tehran, Iran

Background: Length of stay is one of the most important indicators in assessing hospital performance. A shorter stay can reduce the costs per discharge and shift care from inpatient to less expensive post-acute settings. It can lead to a greater readmission rate, better resource management, and more efficient services. Objective: This study aimed to identify the factors influencing length of ho...

2009
Francesco Gargiulo Ludmila I. Kuncheva Carlo Sansone

Classical approaches for network traffic classification are based on port analysis and packet inspection. Recent studies indicate that network protocols can be recognised more accurately using the flow statistics of the TCP connection. We propose a classifier selection ensemble for a fast and accurate verification of network protocols. Using the requested port number, the classifier selector di...
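A hedged sketch of the classifier-selection idea described above: the requested port number dispatches each flow to a per-port model, with a generic fallback. The port list, feature layout, and toy data below are illustrative assumptions, not details from the paper.

```python
# Classifier-selection sketch: route each flow to a classifier chosen
# by its requested port number; fall back to a generic model otherwise.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Toy flow statistics: [mean packet size, mean inter-arrival time, byte count]
def make_flows(n):
    return rng.normal(size=(n, 3)), rng.integers(0, 2, size=n)

# Hypothetical per-port specialist models plus a fallback classifier.
specialists = {}
for port in (80, 443, 25):
    X, y = make_flows(200)
    specialists[port] = DecisionTreeClassifier(max_depth=3).fit(X, y)

X_fb, y_fb = make_flows(200)
fallback = DecisionTreeClassifier(max_depth=3).fit(X_fb, y_fb)

def classify_flow(port, flow_stats):
    """Select the classifier by port number, then label the flow."""
    model = specialists.get(port, fallback)
    return model.predict(flow_stats.reshape(1, -1))[0]

print(classify_flow(443, rng.normal(size=3)))
```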

2004
Vicent Estruch César Ferri José Hernández-Orallo M. José Ramírez-Quintana

Ensemble methods improve accuracy by combining the predictions of a set of different hypotheses. A well-known method for generating hypothesis ensembles is Bagging. One of the main drawbacks of ensemble methods in general, and Bagging in particular, is the huge amount of computational resources required to learn, store, and apply the set of models. Another problem is that even using the bootstr...
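As a reference point for the bagging procedure mentioned above, here is a minimal sketch using scikit-learn's `BaggingClassifier` with decision trees as base learners; the dataset and parameter choices are illustrative, not taken from the paper.

```python
# Bagging sketch: train decision trees on bootstrap resamples of the
# training set and aggregate their predictions by majority vote.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Each of the 50 trees is fit on its own bootstrap sample.
bagging = BaggingClassifier(
    DecisionTreeClassifier(),
    n_estimators=50,
    bootstrap=True,
    random_state=0,
)
bagging.fit(X_train, y_train)
print("bagging accuracy:", bagging.score(X_test, y_test))
```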

Journal: :CoRR 2015
Tom Rainforth Frank D. Wood

We introduce canonical correlation forests (CCFs), a new decision tree ensemble method for classification. Individual canonical correlation trees are binary decision trees with hyperplane splits based on canonical correlation components. Unlike axis-aligned alternatives, the decision surfaces of CCFs are not restricted to the coordinate system of the input features and therefore more naturally r...
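To make the hyperplane-split idea concrete, the sketch below performs a single oblique split along the first canonical correlation component between the features and one-hot class indicators. It illustrates the kind of split a canonical correlation tree uses, not the full CCF training procedure.

```python
# Oblique-split sketch: one split along a canonical correlation direction.
import numpy as np
from sklearn.cross_decomposition import CCA
from sklearn.datasets import load_iris

X, y = load_iris(return_X_y=True)
Y = np.eye(y.max() + 1)[y]  # one-hot class indicators

# The first canonical component between features and class indicators
# defines a hyperplane direction in feature space.
cca = CCA(n_components=1).fit(X, Y)
projection = cca.transform(X).ravel()

# Split on the projected value rather than on a single raw feature.
threshold = np.median(projection)
left, right = y[projection <= threshold], y[projection > threshold]
print("left-node class counts:", np.bincount(left))
print("right-node class counts:", np.bincount(right))
```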

Journal: :CoRR 2017
Rajiv Sambasivan Sourish Das

We present an algorithm for classification tasks on big data. Experiments conducted as part of this study indicate that the algorithm can be as accurate as ensemble methods such as random forests or gradient boosted trees. Unlike ensemble methods, the models produced by the algorithm can be easily interpreted. The algorithm is based on a divide and conquer strategy and consists of two steps. Th...

2014
Krzysztof Grabczewski

The book focuses on different variants of decision tree induction, but also describes the metalearning approach in general, which is applicable to other types of machine learning algorithms. It represents a useful source of information for readers wishing to review some of the techniques used in decision tree learning, as well as di...

Journal: :Pattern Recognition Letters 2008
Chun-Xia Zhang Jiang-She Zhang

This paper presents a novel ensemble classifier generation technique, RotBoost, which is constructed by combining Rotation Forest and AdaBoost. The experiments, conducted with 36 real-world data sets available from the UCI repository and with a classification tree adopted as the base learning algorithm, demonstrate that RotBoost can generate ensemble classifiers with significantly lower pr...
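A simplified, hedged sketch of the RotBoost flavour: a feature-space rotation (plain PCA here, standing in for the Rotation Forest rotation over feature subsets) followed by AdaBoost over shallow classification trees. This approximates the combination for illustration and is not the authors' exact algorithm.

```python
# Simplified RotBoost-style pipeline: feature rotation followed by
# AdaBoost over shallow classification trees.
from sklearn.datasets import load_breast_cancer
from sklearn.decomposition import PCA
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)

# PCA plays the role of the rotation step; AdaBoost then reweights the
# rotated training set and fits one tree per boosting round.
model = make_pipeline(
    PCA(random_state=0),
    AdaBoostClassifier(DecisionTreeClassifier(max_depth=2),
                       n_estimators=50, random_state=0),
)
print("CV accuracy: %.3f" % cross_val_score(model, X, y, cv=5).mean())
```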

Journal: مدیریت فناوری اطلاعات (Information Technology Management)
Mahmoud Alborzi, Science and Research Branch; Mohammad Ebrahim Mohammad Pourzarandi, Islamic Azad University, Central Tehran Branch; Mohammad Khanbabaei, Science and Research Branch

Decision trees, as one of the data mining techniques, are used in credit scoring of bank customers. The main problem is constructing decision trees so that they classify customers optimally. This paper proposes an appropriate model based on a genetic algorithm for credit scoring of bank customers in order to offer credit facilities to each class. A genetic algorithm can help in credit sco...
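A hedged sketch of combining a genetic algorithm with decision-tree credit scoring: a tiny selection-and-mutation GA searches over tree hyperparameters on a synthetic stand-in for a credit-scoring table. The genome encoding, fitness function, and data are assumptions for illustration, not the model proposed in the paper.

```python
# Toy genetic algorithm that evolves decision-tree hyperparameters
# (max_depth, min_samples_leaf) to maximise cross-validated accuracy.
import random
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

random.seed(0)
# Synthetic stand-in for a credit-scoring table (good/bad customers).
X, y = make_classification(n_samples=600, n_features=10, random_state=0)

def fitness(genome):
    depth, leaf = genome
    tree = DecisionTreeClassifier(max_depth=depth, min_samples_leaf=leaf,
                                  random_state=0)
    return cross_val_score(tree, X, y, cv=3).mean()

def mutate(genome):
    depth, leaf = genome
    return (max(1, depth + random.choice([-1, 0, 1])),
            max(1, leaf + random.choice([-2, 0, 2])))

# Initial population of random hyperparameter genomes.
population = [(random.randint(1, 10), random.randint(1, 20)) for _ in range(8)]
for generation in range(5):
    scored = sorted(population, key=fitness, reverse=True)
    parents = scored[:4]                                           # selection
    children = [mutate(random.choice(parents)) for _ in range(4)]  # mutation
    population = parents + children

best = max(population, key=fitness)
print("best (max_depth, min_samples_leaf):", best,
      "accuracy:", round(fitness(best), 3))
```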

2003
Lawrence O. Hall Kevin W. Bowyer Robert E. Banfield Divya Bhadoria W. Philip Kegelmeyer Steven Eschrich

We experimentally evaluate bagging and seven other randomization-based approaches to creating an ensemble of decision-tree classifiers. Unlike methods related to boosting, all of the eight approaches create each classifier in an ensemble independently of the other classifiers in the ensemble. Bagging uses randomization to create multiple training sets. Other approaches, such as those of Dietter...
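In the spirit of that comparison, the sketch below trains two randomization-based tree ensembles whose members are built independently of one another: bagged trees and a random forest (which adds random feature selection at each split). The dataset and settings are illustrative, not those used in the study.

```python
# Compare two randomization-based tree ensembles that build their
# members independently: bagged trees vs. a random forest.
from sklearn.datasets import load_digits
from sklearn.ensemble import BaggingClassifier, RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_digits(return_X_y=True)

ensembles = {
    "bagging": BaggingClassifier(DecisionTreeClassifier(),
                                 n_estimators=100, random_state=0),
    "random forest": RandomForestClassifier(n_estimators=100, random_state=0),
}
for name, model in ensembles.items():
    score = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name}: {score:.3f}")
```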

Chart: number of search results per year
