Boosting Lazy Decision Trees
Authors
Abstract
This paper explores the problem of how to construct lazy decision tree ensembles. We present and empirically evaluate a relevance-based boosting-style algorithm that builds a lazy decision tree ensemble customized for each test instance. From the experimental results, we conclude that our boosting-style algorithm significantly improves the performance of the base learner. An empirical comparison to boosted regular decision trees shows that ensembles of lazy decision trees achieve comparable accuracy and better comprehensibility. We also introduce a novel distance-based pruning strategy for the lazy decision tree algorithm to address the problem of over-fitting. Our experiments show that the pruning strategy improves the accuracy and comprehensibility of both single lazy decision trees and boosted ensembles.
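To make the idea concrete, the following is a minimal sketch, assuming categorical attributes. Each "lazy path" follows only the branch the test instance would take, and the reweighting shown is a generic AdaBoost-style update restricted to the training examples that reach each path's leaf; it only approximates the relevance-based scheme described in the abstract and omits the distance-based pruning. All names (lazy_path, boosted_lazy_predict) are illustrative, not the authors' API.

```python
import numpy as np

def weighted_entropy(y, w):
    total = w.sum()
    if total <= 0:
        return 0.0
    ent = 0.0
    for c in np.unique(y):
        p = w[y == c].sum() / total
        if p > 0:
            ent -= p * np.log2(p)
    return ent

def lazy_path(X, y, w, x_test, max_depth=5):
    """Grow one root-to-leaf path that follows x_test's attribute values,
    picking each split by weighted information gain on the matching branch."""
    idx = np.arange(len(y))
    used = set()
    for _ in range(max_depth):
        if len(np.unique(y[idx])) <= 1:
            break
        base = weighted_entropy(y[idx], w[idx])
        best_gain, best_f = 1e-9, None
        for f in range(X.shape[1]):
            if f in used:
                continue
            mask = X[idx, f] == x_test[f]   # only the branch x_test would follow
            if not mask.any() or mask.all():
                continue
            gain = base - (w[idx][mask].sum() / w[idx].sum()) * \
                weighted_entropy(y[idx][mask], w[idx][mask])
            if gain > best_gain:
                best_gain, best_f = gain, f
        if best_f is None:
            break
        used.add(best_f)
        idx = idx[X[idx, best_f] == x_test[best_f]]
    classes = np.unique(y)
    votes = np.array([w[idx][y[idx] == c].sum() for c in classes])
    return classes[votes.argmax()], idx     # leaf prediction + examples reaching it

def boosted_lazy_predict(X, y, x_test, n_rounds=10):
    """Combine several lazy paths grown for one test instance by weighted voting."""
    w = np.ones(len(y)) / len(y)
    scores = {}
    for _ in range(n_rounds):
        pred, leaf = lazy_path(X, y, w, x_test)
        wrong = leaf[y[leaf] != pred]
        err = w[wrong].sum() / max(w[leaf].sum(), 1e-12)
        if err >= 0.5:                      # no better than chance: stop boosting
            break
        alpha = 0.5 * np.log((1 - err + 1e-12) / (err + 1e-12))
        scores[pred] = scores.get(pred, 0.0) + alpha
        w[wrong] *= np.exp(alpha)           # emphasize leaf examples the path got wrong
        w /= w.sum()
    return max(scores, key=scores.get) if scores else lazy_path(X, y, w, x_test)[0]
```

Because only the branch matching the test instance is ever grown, each ensemble member is a single short path rather than a full tree, which is what makes the per-instance ensemble cheap to build and easy to read.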
Similar papers
Lazy Bayesian Rules: A Lazy Semi-Naive Bayesian Learning Technique Competitive to Boosting Decision Trees
Lbr is a lazy semi-naive Bayesian classifier learning technique, designed to alleviate the attribute interdependence problem of naive Bayesian classification. To classify a test example, it creates a conjunctive rule that selects a most appropriate subset of training examples and induces a local naive Bayesian classifier using this subset. Lbr can significantly improve the performance of the naive...
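The sketch below is not Lbr itself: it hard-codes the conjunctive rule to a single attribute-value condition taken from the test instance (Lbr selects the conjunction greedily by estimated error), then fits a local naive Bayes with Laplace smoothing on the matching training examples. All names are illustrative.

```python
import numpy as np

def local_naive_bayes_predict(X, y, x_test, cond_attr):
    # keep only training examples that satisfy the conjunctive rule
    keep = X[:, cond_attr] == x_test[cond_attr]
    Xl, yl = X[keep], y[keep]
    # attributes used in the rule are dropped from the local model
    attrs = [a for a in range(X.shape[1]) if a != cond_attr]
    classes = np.unique(y)
    best_c, best_logp = None, -np.inf
    for c in classes:
        Xc = Xl[yl == c]
        logp = np.log((len(Xc) + 1) / (len(Xl) + len(classes)))   # smoothed prior
        for a in attrs:
            n_vals = len(np.unique(X[:, a]))
            match = (Xc[:, a] == x_test[a]).sum()
            logp += np.log((match + 1) / (len(Xc) + n_vals))      # smoothed likelihood
        if logp > best_logp:
            best_c, best_logp = c, logp
    return best_c
```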
Global Optimization in Learning with Important Data: an FCA-Based Approach
Nowadays decision tree learning is one of the most popular classification and regression techniques. Though decision trees are not accurate on their own, they make very good base learners for advanced tree-based methods such as random forests and gradient boosted trees. However, applying ensembles of trees deteriorates interpretability of the final model. Another problem is that decision tree l...
Batched Lazy Decision Trees
We introduce a batched lazy algorithm for supervised classification using decision trees. It avoids unnecessary visits to irrelevant nodes when it is used to make predictions with either eagerly or lazily trained decision trees. A set of experiments demonstrate that the proposed algorithm can outperform both the conventional and lazy decision tree algorithms in terms of computation time as well...
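As a rough illustration of the batching idea, the sketch below routes a whole batch of test rows through an already-trained binary tree together, so each node is visited at most once per batch and subtrees reached by no test row are skipped entirely. The dict-based node layout and function names are illustrative, not the paper's.

```python
import numpy as np

def predict_batch(tree, node_id, X, rows, out):
    node = tree[node_id]
    if "label" in node:                       # leaf: assign its class to all rows here
        out[rows] = node["label"]
        return
    go_left = X[rows, node["feature"]] <= node["threshold"]
    left_rows, right_rows = rows[go_left], rows[~go_left]
    if len(left_rows):                        # skip child nodes no test row reaches
        predict_batch(tree, node["left"], X, left_rows, out)
    if len(right_rows):
        predict_batch(tree, node["right"], X, right_rows, out)

# usage: a tiny hand-built tree splitting on one feature
tree = {
    0: {"feature": 0, "threshold": 0.5, "left": 1, "right": 2},
    1: {"label": 0},
    2: {"label": 1},
}
X_test = np.array([[0.2], [0.9], [0.4]])
preds = np.empty(len(X_test), dtype=int)
predict_batch(tree, 0, X_test, np.arange(len(X_test)), preds)
print(preds)   # [0 1 0]
```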
Boosting with Multi-Way Branching in Decision Trees
It is known that decision tree learning can be viewed as a form of boosting. However, existing boosting theorems for decision tree learning allow only binary-branching trees and the generalization to multi-branching trees is not immediate. Practical decision tree algorithms, such as CART and C4.5, implement a trade-off between the number of branches and the improvement in tree quality as measur...
A Boosting method in Combination with Decision Trees
This paper describes boosting, a method that can improve the results of classification algorithms. The method is aimed at classification algorithms that generate decision trees. A modification of the AdaBoost algorithm was implemented. Results of performance tests focused on applying the boosting method to binary decision trees are presented. The minimum number of decision trees, which en...
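For reference, the following is a minimal sketch of the standard, unmodified AdaBoost.M1 loop over shallow binary decision trees, using scikit-learn trees as the base learner; it illustrates the general boosting scheme this line of work builds on, not the paper's specific modification. Labels are assumed to be encoded as -1 and +1, and the function names are illustrative.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_fit(X, y, n_rounds=20, max_depth=1):
    n = len(y)
    w = np.full(n, 1.0 / n)
    trees, alphas = [], []
    for _ in range(n_rounds):
        tree = DecisionTreeClassifier(max_depth=max_depth)
        tree.fit(X, y, sample_weight=w)
        pred = tree.predict(X)
        err = w[pred != y].sum()
        if err >= 0.5:                     # no better than chance: stop
            break
        alpha = 0.5 * np.log((1 - err) / max(err, 1e-12))
        w *= np.exp(-alpha * y * pred)     # upweight misclassified examples
        w /= w.sum()
        trees.append(tree)
        alphas.append(alpha)
    return trees, alphas

def adaboost_predict(trees, alphas, X):
    score = sum(a * t.predict(X) for t, a in zip(trees, alphas))
    return np.sign(score)
```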