Search results for: decision trees
Number of results: 422691
In many cases it is better to extract a set of decision trees and a set of possible logical data descriptions instead of a single model. Methods for creating forests of decision trees based on the Separability of Split Value (SSV) criterion are presented. Preliminary results confirm their usefulness in understanding data structures.
Intermediate decision trees are the subtrees of the full (unpruned) decision tree generated in a breadth-first order. An extensive empirical investigation evaluates the classification error of intermediate decision trees and compares their performance to full and pruned trees. Empirical results were generated using C4.5 with 66 databases from the UCI machine learning database repository. Result...
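As one reading of this construction, the sketch below enumerates the intermediate trees of an already-grown binary tree by expanding its internal nodes in breadth-first order; the Node class, the split_fn callback, and the majority-class fallback are illustrative assumptions, not details from the paper.

from collections import deque

class Node:
    # A node of a fully grown (unpruned) binary decision tree. Each node
    # also stores the majority class of the training examples that reached
    # it, so it can act as a leaf when the subtree below it is cut off.
    def __init__(self, majority_class, left=None, right=None):
        self.majority_class = majority_class
        self.left = left
        self.right = right

    def is_internal(self):
        return self.left is not None and self.right is not None

def intermediate_trees(root):
    # Yield the sequence of intermediate trees, each represented by the set
    # of internal nodes that have been "expanded", visiting nodes in
    # breadth-first order. The first tree is the root alone (a single leaf);
    # the last is the full tree.
    expanded = set()
    yield set(expanded)                 # tree 0: root used as a leaf
    queue = deque([root])
    while queue:
        node = queue.popleft()
        if node.is_internal():
            expanded.add(id(node))      # expand this node's split
            queue.append(node.left)
            queue.append(node.right)
            yield set(expanded)         # next intermediate tree

def predict(node, x, expanded, split_fn):
    # Classify x with the intermediate tree defined by 'expanded'.
    # split_fn(node, x) -> True to go left; it is a hypothetical callback
    # standing in for the node's learned test.
    while node.is_internal() and id(node) in expanded:
        node = node.left if split_fn(node, x) else node.right
    return node.majority_class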
The recently proposed budding tree is a decision tree algorithm in which every node is part internal node and part leaf. This allows every decision tree to be represented in a continuous parameter space, and therefore a budding tree can be trained jointly with backpropagation, like a neural network. Even though this continuity allows it to be used in hierarchical representation learning, the learned rep...
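A minimal sketch of the soft node that makes this possible, assuming a sigmoid gating function and a per-node mixing parameter gamma (names and initialization are illustrative, not taken from the abstract): each node's response is a differentiable blend of its own leaf value and the gated responses of its children, so gradients can flow through the whole tree.

import numpy as np

class BudNode:
    # A budding-tree node: responds partly as a leaf (its own value rho)
    # and partly as an internal node (a gated mix of its children).
    # gamma in [0, 1] interpolates between the two roles; because the
    # response is differentiable in (w, b, rho, gamma), the tree can be
    # trained with gradient descent / backpropagation.
    def __init__(self, dim, left=None, right=None):
        self.w = np.random.randn(dim) * 0.01   # gating weights (assumed sigmoid gate)
        self.b = 0.0                            # gating bias
        self.rho = 0.0                          # leaf response
        self.gamma = 1.0                        # 1.0 = pure leaf, 0.0 = pure internal node
        self.left, self.right = left, right

    def response(self, x):
        leaf_part = self.rho
        if self.left is None or self.right is None:
            return leaf_part
        g = 1.0 / (1.0 + np.exp(-(self.w @ x + self.b)))    # soft routing to children
        internal_part = g * self.left.response(x) + (1.0 - g) * self.right.response(x)
        return self.gamma * leaf_part + (1.0 - self.gamma) * internal_part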
Ensemble methods improve accuracy by combining the predictions of a set of different hypotheses. A well-known method for generating hypothesis ensembles is Bagging. One of the main drawbacks of ensemble methods in general, and Bagging in particular, is the huge amount of computational resources required to learn, store, and apply the set of models. Another problem is that even using the bootstr...
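For context, a minimal Bagging sketch with decision trees as the base learner; the use of scikit-learn, the ensemble size, and the majority-vote combiner are illustrative choices, not specified by the abstract. Having to fit, store, and query all n_models trees is exactly the resource cost the abstract points to.

import numpy as np
from sklearn.tree import DecisionTreeClassifier   # base learner choice is illustrative

def bagging_fit(X, y, n_models=25, seed=0):
    # Train an ensemble: each model sees a bootstrap sample (drawn with
    # replacement, same size as the original training set).
    rng = np.random.default_rng(seed)
    models = []
    n = len(X)
    for _ in range(n_models):
        idx = rng.integers(0, n, size=n)           # bootstrap sample indices
        models.append(DecisionTreeClassifier().fit(X[idx], y[idx]))
    return models

def bagging_predict(models, X):
    # Combine the individual predictions by majority vote
    # (assumes non-negative integer class labels).
    votes = np.stack([m.predict(X) for m in models])        # (n_models, n_samples)
    return np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, votes)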
Decision trees are arguably one of the most popular choices for learning and reasoning systems, especially when it comes to learning from discrete-valued (feature-based) examples. Because of the success they have achieved in this area, there have been many attempts to generalize the method to work better with real-valued, numerical attributes, but also with missing values or even numerical o...
The term data mining is mainly used for a specific set of six activities, namely Classification, Estimation, Prediction, Affinity grouping or Association rules, Clustering, and Description and Visualization. The first three tasks, classification, estimation and prediction, are all examples of directed data mining or supervised learning. The Decision Tree (DT) is one of the most popular choices for learning ...
A celebrated theorem of Friedgut says that every function f : {0, 1}^n → {0, 1} can be approximated by a function g : {0, 1}^n → {0, 1} with ‖f − g‖_2 ≤ ε which depends only on e^{O(I_f/ε)} variables, where I_f is the sum of the influences of the variables of f. Dinur and Friedgut later showed that this statement also holds if we replace the discrete domain {0, 1}^n with the continuous domain [0, 1]^n, und...
These notes introduce a new kind of classifier called a dyadic decision tree (DDT). We also introduce a discrimination rule for learning a DDT that achieves the optimal rate of convergence, E R(ĥ_n) − R* = O(n^{−1/d}), for the box-counting class, which was defined in the previous set of notes. This improves on the rate of E R(ĥ_n) − R* = O(n^{−1/(d+2)}) for the histogram sieve estimator from the previous no...