Search results for: bagging model
Number of results: 122039
Ensembles of classifiers are among the strongest models in most data mining applications. Bagging ensembles exploit the instability of base classifiers by training them on different bootstrap replicates. It has been shown that bagging unstable classifiers, such as decision trees, generally yields good results, whereas bagging stable classifiers, such as k-NN, makes little difference. Howeve...
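For illustration only, a minimal sketch of this unstable-versus-stable comparison, assuming scikit-learn and a synthetic dataset (the dataset, estimators and hyperparameters below are assumptions, not taken from the paper):

```python
# Compare bagging an unstable learner (decision tree) with bagging a stable
# one (k-NN) on a synthetic problem; all settings here are illustrative.
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

for name, base in [("decision tree", DecisionTreeClassifier(random_state=0)),
                   ("k-NN", KNeighborsClassifier(n_neighbors=5))]:
    single = cross_val_score(base, X, y, cv=5).mean()
    bagged = cross_val_score(
        BaggingClassifier(base, n_estimators=50, random_state=0), X, y, cv=5
    ).mean()
    print(f"{name}: single={single:.3f}  bagged={bagged:.3f}")
```

The base estimator is passed positionally because its keyword name changed from base_estimator to estimator in scikit-learn 1.2.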
Abstract: Bagging is an aggregation technique in which an estimator is obtained as the average of predictors computed on bootstrap samples. Bagged decision trees almost always improve on the original predictor, and it is commonly held that the effectiveness of bagging is due to variance reduction. In this work we show a counter-example and give experimental evidence...
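The "variance reduction" view referred to here rests on a textbook identity, sketched below for B bootstrap predictors with common variance and pairwise correlation (this is the standard argument being questioned, not the paper's counter-example):

```latex
% Variance of the bagged (averaged) predictor, assuming each bootstrap
% predictor \hat{f}_b(x) has variance \sigma^2 and pairwise correlation \rho:
\operatorname{Var}\!\left(\frac{1}{B}\sum_{b=1}^{B}\hat{f}_b(x)\right)
  = \rho\,\sigma^2 + \frac{1-\rho}{B}\,\sigma^2
% As B \to \infty the second term vanishes, so averaging lowers variance
% only insofar as the bootstrap predictors are imperfectly correlated (\rho < 1).
```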
In this paper, a new variant of Bagging named DepenBag is proposed. The algorithm first draws bootstrap samples. It then employs a causal discoverer to induce from each sample a dependency model expressed as a directed acyclic graph (DAG). Attributes that have no connection to the class attribute in any of the DAGs are removed. Finally, a component learner is trained from each of the r...
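A structural sketch of the procedure as described, with the causal-discovery step replaced by a hypothetical callback connected_to_class (any function that learns a dependency DAG from one bootstrap sample and returns the attribute indices linked to the class node); the base learner and sample counts below are assumptions:

```python
# Structural sketch of a DepenBag-style procedure (illustrative only).
import numpy as np
from sklearn.base import clone
from sklearn.tree import DecisionTreeClassifier

def depenbag_fit(X, y, connected_to_class, n_estimators=25,
                 base=DecisionTreeClassifier(), seed=0):
    rng = np.random.default_rng(seed)
    n = len(X)
    boot_idx = [rng.integers(0, n, size=n) for _ in range(n_estimators)]

    # An attribute is removed only if it is disconnected from the class in
    # every induced DAG, i.e. it is kept if connected in at least one.
    keep = set()
    for idx in boot_idx:
        keep |= set(connected_to_class(X[idx], y[idx]))
    keep = sorted(keep)

    # Train one component learner per reduced bootstrap sample.
    models = [clone(base).fit(X[idx][:, keep], y[idx]) for idx in boot_idx]
    return keep, models
```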
As a growing number of real-world applications involve imbalanced class distributions or unequal misclassification costs across classes, learning from imbalanced class distributions is considered one of the most challenging issues in data mining research. This study empirically investigates the sensitivity of bagging predictors with respect to 12 algorithms and 9 levels of c...
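A minimal sketch of this kind of sensitivity check, assuming scikit-learn; the imbalance ratios, metric and single base learner below are illustrative stand-ins, not the study's 12 algorithms and 9 class-distribution levels:

```python
# Score a bagged tree ensemble across increasingly imbalanced synthetic data.
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

for minority in (0.5, 0.3, 0.2, 0.1, 0.05):
    X, y = make_classification(n_samples=2000, n_features=20,
                               weights=[1 - minority, minority], random_state=0)
    clf = BaggingClassifier(DecisionTreeClassifier(), n_estimators=50,
                            random_state=0)
    score = cross_val_score(clf, X, y, cv=5, scoring="balanced_accuracy").mean()
    print(f"minority fraction {minority:.2f}: balanced accuracy {score:.3f}")
```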
In this paper, we apply the combination method of bagging, developed in the context of supervised learning of classifiers and regressors, to the unsupervised artificial neural network known as the Self Organising Map. We show that various initialisation techniques can be used to create maps that humans can compare by eye. We then use a semi-supervised version of the SOM to c...
Previous research has shown that Bagging, Boosting and Cross-Validation Committee can each provide good performance separately. In this paper, Boosting methods are combined with Bagging and Cross-Validation Committee in order to generate accurate ensembles and benefit from all of these alternatives. In this way, the networks are trained according to the boosting methods, but the specific t...
Ensemble classification combines individually trained classifiers to obtain more accurate predictions than individual classifiers alone. Ensemble techniques are very useful for improving the generalizability of the classifier. Bagging is the method used most commonly for constructing ensemble classifiers. In bagging, different training data subsets are drawn randomly with replacement from the o...
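The construction described here, in a bare-bones sketch (bootstrap resampling with replacement, one model per replicate, majority vote at prediction time); names and defaults are illustrative, and scikit-learn's BaggingClassifier packages the same idea:

```python
# Bare-bones bagging: resample with replacement, fit, vote.
import numpy as np
from sklearn.base import clone
from sklearn.tree import DecisionTreeClassifier

def fit_bagging(X, y, base=DecisionTreeClassifier(), n_estimators=25, seed=0):
    rng = np.random.default_rng(seed)
    n = len(X)
    models = []
    for _ in range(n_estimators):
        idx = rng.integers(0, n, size=n)      # draw n rows with replacement
        models.append(clone(base).fit(X[idx], y[idx]))
    return models

def predict_bagging(models, X):
    # Majority vote per sample; assumes non-negative integer class labels.
    votes = np.stack([m.predict(X) for m in models])   # (n_estimators, n_samples)
    return np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, votes)
```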
An extreme learning machine (ELM) is a recently proposed learning algorithm for single-hidden-layer feedforward neural networks. In this paper we study an ensemble of ELMs built with a bagging algorithm for facial expression recognition (FER). Facial expression analysis is widely used in the behavioural interpretation of emotions, in cognitive science, and in the study of social interaction. This paper presents a me...
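A rough sketch of a bagged ELM ensemble along these lines, assuming plain NumPy arrays; the hidden-layer size, sigmoid activation and bootstrap settings are assumptions, and none of the FER-specific feature extraction is reproduced:

```python
# Single-hidden-layer ELM (random input weights, analytic output weights)
# plus a bagged ensemble of such models; everything here is illustrative.
import numpy as np

class ELM:
    def __init__(self, n_hidden=100, seed=0):
        self.n_hidden, self.seed = n_hidden, seed

    def _hidden(self, X):
        return 1.0 / (1.0 + np.exp(-(X @ self.W + self.b)))   # sigmoid activations

    def fit(self, X, y):
        X, y = np.asarray(X), np.asarray(y)
        rng = np.random.default_rng(self.seed)
        self.classes_ = np.unique(y)
        self.W = rng.normal(size=(X.shape[1], self.n_hidden))  # random input weights
        self.b = rng.normal(size=self.n_hidden)                # random hidden biases
        T = (y[:, None] == self.classes_[None, :]).astype(float)  # one-hot targets
        self.beta = np.linalg.pinv(self._hidden(X)) @ T        # least-squares output weights
        return self

    def predict(self, X):
        return self.classes_[np.argmax(self._hidden(np.asarray(X)) @ self.beta, axis=1)]

def bagged_elm(X, y, n_estimators=15, n_hidden=100, seed=0):
    X, y = np.asarray(X), np.asarray(y)
    rng = np.random.default_rng(seed)
    models = []
    for k in range(n_estimators):
        idx = rng.integers(0, len(X), size=len(X))             # bootstrap replicate
        models.append(ELM(n_hidden=n_hidden, seed=seed + k).fit(X[idx], y[idx]))
    return models
```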
Bagging forms a committee of classifiers by bootstrap aggregation of training sets from a pool of training data. A simple alternative to bagging is to partition the data into disjoint subsets. Experiments with decision tree and neural network classifiers on various datasets show that, given the same size partitions and bags, disjoint partitions result in performance equivalent to, o...
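The two committee-building schemes being compared can be sketched as index generators (hypothetical helpers, not the paper's experimental code):

```python
# Bootstrap bags versus disjoint partitions of the same training pool.
import numpy as np

def bootstrap_bags(n_samples, n_members, bag_size, seed=0):
    rng = np.random.default_rng(seed)
    # each committee member sees `bag_size` indices drawn with replacement
    return [rng.integers(0, n_samples, size=bag_size) for _ in range(n_members)]

def disjoint_partitions(n_samples, n_members, seed=0):
    rng = np.random.default_rng(seed)
    # shuffle once, then split into n_members non-overlapping chunks
    return np.array_split(rng.permutation(n_samples), n_members)
```

A bag drawn with replacement generally contains repeated examples, whereas a disjoint partition of the same size contains only distinct ones; that difference in effective coverage is what the experiments above probe.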