Search results for: logitboost
Number of results: 116
Rotation Forest is a recently proposed method for building classifier ensembles using independently trained decision trees. It was found to be more accurate than bagging, AdaBoost and Random Forest ensembles across a collection of benchmark data sets. This paper carries out a lesion study on Rotation Forest in order to find out which of the parameters and the randomization heuristics are respon...
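The abstract above gives no implementation details; the following is only a rough Python sketch of the Rotation Forest idea it discusses (split the features into random disjoint subsets, fit a PCA to a bootstrap sample restricted to each subset, assemble the principal axes into one rotation matrix, and train a decision tree on the rotated data). The function names, the fixed 75% bootstrap fraction, and the simplifications (no per-class bootstrapping, integer class labels assumed) are illustrative assumptions, not the authors' code.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.tree import DecisionTreeClassifier

def build_rotation(X, n_subsets=3, seed=None):
    """Build a (d x d) rotation matrix from PCAs fitted on disjoint random feature subsets."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    subsets = np.array_split(rng.permutation(d), n_subsets)
    R = np.zeros((d, d))
    for subset in subsets:
        # PCA on a bootstrap sample of rows, restricted to this feature subset
        # (assumes the bootstrap sample has at least len(subset) rows).
        rows = rng.choice(X.shape[0], size=int(0.75 * X.shape[0]), replace=True)
        pca = PCA().fit(X[np.ix_(rows, subset)])
        R[np.ix_(subset, subset)] = pca.components_.T
    return R

def fit_rotation_forest(X, y, n_trees=10, seed=0):
    ensemble = []
    for t in range(n_trees):
        R = build_rotation(X, seed=seed + t)
        tree = DecisionTreeClassifier(random_state=seed + t).fit(X @ R, y)
        ensemble.append((R, tree))
    return ensemble

def predict_rotation_forest(ensemble, X):
    votes = np.stack([tree.predict(X @ R) for R, tree in ensemble])
    # Majority vote per sample (assumes integer class labels).
    return np.apply_along_axis(lambda v: np.bincount(v.astype(int)).argmax(), 0, votes)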
The present study has proposed three novel hybrid models by integrating traditional ensemble models, such as random forest, LogitBoost, and naive Bayes, and six newly developed variants of rotation forest (RF), namely decision tree (RF-DT), J48 (RF-J48), naive Bayes tree (RF-NBT), neural network (RF-NN), M5P (RF-M5P), and REPTree (RF-REPTree), with statistical models, i.e. weight of evidence (WOE) and logistic regression (LR), and their combination (WOE-LR). To predi...
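For readers unfamiliar with the "ensemble combined with a statistical model" pattern this abstract refers to, the snippet below is a generic, hedged illustration using scikit-learn stacking (a random forest and naive Bayes combined through logistic regression). It is not the paper's actual RF-/WOE-based pipeline, and all parameter choices are arbitrary.

from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB

# Tree-ensemble and Bayesian base learners, combined by a logistic-regression meta-model.
hybrid = StackingClassifier(
    estimators=[
        ("random_forest", RandomForestClassifier(n_estimators=200, random_state=0)),
        ("naive_bayes", GaussianNB()),
    ],
    final_estimator=LogisticRegression(max_iter=1000),  # statistical combiner
    cv=5,
)
# Usage: hybrid.fit(X_train, y_train); hybrid.predict_proba(X_test)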
We study boosting algorithms from a new perspective. We show that the Lagrange dual problems of AdaBoost, LogitBoost and soft-margin LPBoost with generalized hinge loss are all entropy maximization problems. By looking at the dual problems of these boosting algorithms, we show that the success of boosting algorithms can be understood in terms of maintaining a better margin distribution by maxim...
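As a schematic of the kind of primal-dual pair the abstract refers to (notation assumed here; the paper's exact regularized formulations for AdaBoost, LogitBoost and LPBoost differ): writing exponential-loss boosting in log-sum-exp form, Fenchel/Lagrange duality turns it into an entropy-maximization problem over the sample weights u, constrained by the weak learners' edges.

% Schematic primal-dual pair for exponential-loss boosting (hedged; w_j are
% weak-learner coefficients, h_j weak learners, u_i sample weights).
\begin{align*}
\text{(primal)}\quad & \min_{w \ge 0}\ \log \sum_{i=1}^{m} \exp\Big(-y_i \sum_{j} w_j h_j(x_i)\Big) \\
\text{(dual)}\quad & \max_{u \in \Delta_m}\ -\sum_{i=1}^{m} u_i \log u_i
  \quad \text{s.t.}\quad \sum_{i=1}^{m} u_i\, y_i\, h_j(x_i) \le 0 \ \ \forall j
\end{align*}

Here \Delta_m is the probability simplex; read this way, the booster keeps the sample-weight distribution as close to uniform (maximum entropy) as the edge constraints allow, which is one way to see the margin-distribution argument made in the abstract.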
Basic human activity recognition performed by ubiquitous devices represents one important area of research seeking systems that are simple to use, reliable, accurate, and of low cost. Each human possesses individual and distinct characteristics in the way they perform physical activities such as walking or running. Therefore, it is important that the activity recognition system can adapt t...
It is known that Boosting can be interpreted as a gradient descent technique to minimize an underlying loss function. Specifically, the loss being minimized by traditional AdaBoost is the exponential loss, which has proved very sensitive to random noise/outliers. Therefore, several boosting algorithms, e.g., LogitBoost and SavageBoost, have been proposed to improve robustness by replacing the exponential loss with specially designed robust loss functions....
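To make the robustness point concrete, here is a small numpy sketch (not from the paper) comparing the exponential loss used by AdaBoost with the logistic loss behind LogitBoost and a SavageBoost-style bounded loss, all written as functions of the margin z = y*F(x); the exact scaling conventions are assumptions. The exponential loss grows without bound for badly misclassified points (large negative margins), the logistic loss grows only linearly, and the Savage-type loss is bounded, which is why the latter two are considered more robust to noise and outliers.

import numpy as np

def exponential_loss(z):   # AdaBoost surrogate
    return np.exp(-z)

def logistic_loss(z):      # LogitBoost surrogate (up to scaling / log base)
    return np.log1p(np.exp(-z))

def savage_loss(z):        # SavageBoost-style bounded surrogate
    return 1.0 / (1.0 + np.exp(2.0 * z)) ** 2

margins = np.array([-4.0, -1.0, 0.0, 1.0, 4.0])
for name, loss in [("exponential", exponential_loss), ("logistic", logistic_loss), ("savage", savage_loss)]:
    print(name, np.round(loss(margins), 3))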
Many classification algorithms achieve poor generalization accuracy on “noisy” data sets. We introduce a new non-convex boosting algorithm BrownBoost-δ, a noise-resistant booster, that is able to significantly increase accuracy on a set of noisy classification problems. Our algorithm consistently outperforms the original BrownBoost algorithm, AdaBoost, and LogitBoost on simulated and real data. ...
This paper describes a system that automatically classifies text readability for European Portuguese, while highlighting the key challenges in language feature selection and text classification. To this end, the system uses existing Natural Language Processing (NLP) tools to extract linguistic features from texts, which are then used by an automatic readability classifier. Currently, the sys...
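The abstract does not name the features or the classifier; the snippet below is only a generic, hedged sketch of the "linguistic features feeding a readability classifier" pattern it describes. The feature functions, the toy Portuguese sentences, the CEFR-style labels, and the logistic-regression classifier are all illustrative assumptions.

from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def extract_features(text):
    # Stand-ins for linguistic features an NLP toolkit would provide
    # (e.g. sentence length, word length, type/token ratio).
    words = text.split()
    sentences = [s for s in text.split(".") if s.strip()]
    return [
        len(words) / max(len(sentences), 1),               # avg words per sentence
        sum(len(w) for w in words) / max(len(words), 1),   # avg word length
        len(set(words)) / max(len(words), 1),              # type/token ratio
    ]

texts = ["O gato dorme.", "A interdependência dos sistemas económicos europeus é complexa."]
levels = ["A1", "C1"]  # hypothetical readability labels
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
clf.fit([extract_features(t) for t in texts], levels)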
Classification and prediction of protein domain structural class is one of the important topics in molecular biology. We introduce Bagging (bootstrap aggregating), one of the bootstrap methods, for classifying and predicting protein structural classes. By a bootstrap aggregating procedure, Bagging can improve a weak classifier, for instance the random tree method, to a significant s...
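As a hedged illustration of the bagging idea in this abstract (not the authors' protein-data setup), the snippet below bootstrap-aggregates a deliberately weak decision tree with scikit-learn; the synthetic data and all parameter values are assumptions made for the example.

from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=300, n_features=20, n_informative=8, random_state=0)
# A "random tree"-like weak learner: each split considers a single random feature.
weak_tree = DecisionTreeClassifier(max_features=1, random_state=0)
bagged = BaggingClassifier(weak_tree, n_estimators=100, random_state=0)

print("single tree:", cross_val_score(weak_tree, X, y, cv=5).mean())
print("bagged trees:", cross_val_score(bagged, X, y, cv=5).mean())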