Search results for: bagging model
Number of results: 122039
Machine Learning tools are increasingly being applied to analyze data from microarray experiments. These include ensemble methods where weighted votes of constructed base classifiers are used to classify data. We compare the performance of AdaBoost, bagging and BagBoost on gene expression data from the yeast cell cycle. AdaBoost was found to be more effective for the data than bagging. BagBoost...
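A minimal sketch of the kind of comparison this abstract describes, using scikit-learn's BaggingClassifier and AdaBoostClassifier on synthetic high-dimensional data standing in for gene-expression profiles. The dataset, sizes, and ensemble settings are illustrative assumptions, not the paper's setup.

```python
# Synthetic stand-in for microarray data: many features, few informative genes.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=200, n_features=500,
                           n_informative=20, random_state=0)

ensembles = {
    "bagging": BaggingClassifier(n_estimators=50, random_state=0),    # bootstrap + vote
    "AdaBoost": AdaBoostClassifier(n_estimators=50, random_state=0),  # weighted voting
}
for name, clf in ensembles.items():
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"{name}: mean accuracy = {scores.mean():.3f}")
```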
The problem of large-scale simultaneous hypothesis testing is revisited. Bagging and subagging procedures are put forth with the purpose of improving the discovery power of the tests. The procedures are implemented in both simulated and real data. It is shown that bagging and subagging significantly improve power at the cost of a small increase in false discovery rate with the proposed ‘maximum...
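The abstract does not spell out the procedure, so the following is only a rough sketch of the subagging idea under stated assumptions: per-hypothesis t-statistics are recomputed on random subsamples, averaged, and then passed through a Benjamini-Hochberg correction. The subsample size, the averaging step, and the normal reference used for p-values are all illustrative choices, not the paper's method.

```python
import numpy as np
from scipy import stats
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(0)
n, m = 40, 1000                        # samples per group, number of hypotheses
x = rng.normal(size=(n, m))
y = rng.normal(size=(n, m))
y[:, :50] += 0.8                       # plant 50 true effects

B, sub = 100, n // 2                   # subsample count and size (assumed values)
t_agg = np.zeros(m)
for _ in range(B):
    ix = rng.choice(n, sub, replace=False)
    iy = rng.choice(n, sub, replace=False)
    t_agg += stats.ttest_ind(x[ix], y[iy]).statistic
t_agg /= B                             # subagged (averaged) test statistics

# Rough two-sided p-values from a normal reference (a placeholder; calibrating
# aggregated statistics properly is exactly the kind of detail the paper handles).
p = 2 * stats.norm.sf(np.abs(t_agg))
reject = multipletests(p, alpha=0.05, method="fdr_bh")[0]
print("discoveries:", int(reject.sum()))
```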
We use amortized inference in conjunction with implicit models to approximate the bootstrap distribution over model parameters. We call this the amortized bootstrap, as statistical strength is shared across dataset replicates through a metamodel. At test time, we can then perform amortized bagging by drawing multiple samples from the implicit model. We find amortized bagging outperforms bagging...
The aim of this study was to compare three parametric methods (GBLUP, BayesB, RKHS) and two resampling methods (Bagging GBLUP and Random Forest) for predicting genomic breeding values for traits with different genetic architectures. A genome with three chromosomes, each one Morgan in length, was simulated, and 1500 single-nucleotide polymorphism (SNP) markers were distributed uniformly across it under three scenarios of 50, 100, and 200 QTL. The substitution effects of the QTLs were drawn from a norm...
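A minimal sketch, not the study's simulation design: it assumes a SNP genotype matrix coded 0/1/2 with additive QTL effects, and uses a Random Forest (one of the resampling methods compared) to predict breeding values. Population size, heritability, and hyperparameters are made up for illustration.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n_animals, n_snp, n_qtl = 1000, 1500, 100
X = rng.integers(0, 3, size=(n_animals, n_snp)).astype(float)   # genotypes coded 0/1/2
qtl = rng.choice(n_snp, n_qtl, replace=False)                   # markers acting as QTL
effects = rng.normal(size=n_qtl)                                # substitution effects
tbv = X[:, qtl] @ effects                                       # true breeding values
y = tbv + rng.normal(scale=tbv.std(), size=n_animals)           # phenotypes, h2 ~ 0.5

X_tr, X_te, y_tr, y_te, tbv_tr, tbv_te = train_test_split(
    X, y, tbv, test_size=0.3, random_state=1)
rf = RandomForestRegressor(n_estimators=200, random_state=1).fit(X_tr, y_tr)
acc = np.corrcoef(rf.predict(X_te), tbv_te)[0, 1]               # predictive accuracy
print(f"correlation with true breeding values: {acc:.2f}")
```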
Inspired by the results of an applied study, this research presents a hierarchical approach for guiding the supplier development process and supporting the decisions involved at each of its stages. First, the supply areas in need of development, and then the qualified suppliers within each area, are identified with the help of the best-worst multi-criteria decision-making method. The identification criteria are derived from a review of previous studies and from the opinions of purchasing experts. Finally, a mathematical model ...
In bagging [Bre94a] one uses bootstrap replicates of the training set [Efr79, ET93] to improve a learning algorithm's performance, often by tens of percent. This paper presents several ways that stacking [Wol92b, Bre92] can be used in concert with the bootstrap procedure to achieve a further improvement on the performance of bagging for some regression problems. In particular, in some of the work ...
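A hedged illustration of the general idea of combining stacking with bagging, not the paper's exact construction: bagged base regressors are combined by a learned stacking layer instead of a plain average, using scikit-learn's StackingRegressor. The data and model choices are assumptions for the example.

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import BaggingRegressor, StackingRegressor
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsRegressor
from sklearn.tree import DecisionTreeRegressor

X, y = make_regression(n_samples=300, n_features=20, noise=10.0, random_state=0)

# Two bagged base models built on bootstrap replicates of the training set.
bagged_trees = BaggingRegressor(DecisionTreeRegressor(), n_estimators=25, random_state=0)
bagged_knn = BaggingRegressor(KNeighborsRegressor(), n_estimators=25, random_state=0)

# Stacking: a level-1 model learns how to combine the bagged predictors,
# rather than weighting them equally.
stack = StackingRegressor(
    estimators=[("bag_tree", bagged_trees), ("bag_knn", bagged_knn)],
    final_estimator=Ridge())

for name, model in [("bagged trees alone", bagged_trees), ("stacked bagging", stack)]:
    print(f"{name}: mean R^2 = {cross_val_score(model, X, y, cv=5).mean():.3f}")
```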
Ensemble learning (the process of combining multiple models into a single decision) is an effective tool for improving the classification performance of inductive models. While ensemble learning is well suited to domains like bioinformatics, with its many challenging datasets, many ensemble methods, such as Bagging and Boosting, do not take into account the high dimensionality (large number of features per instance) that is comm...
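One common way ensembles are adapted to high-dimensional data is to subsample features as well as instances (the random subspace idea). The sketch below illustrates that general point with scikit-learn's BaggingClassifier; it is not the specific method this abstract proposes, and the data and fractions are illustrative.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score

# Synthetic high-dimensional data: 2000 features, only 30 informative.
X, y = make_classification(n_samples=200, n_features=2000,
                           n_informative=30, random_state=0)

plain = BaggingClassifier(n_estimators=50, random_state=0)
subspace = BaggingClassifier(n_estimators=50, random_state=0,
                             max_features=0.1)   # each learner sees 10% of the features

for name, clf in [("plain bagging", plain), ("feature-subsampled bagging", subspace)]:
    print(f"{name}: mean accuracy = {cross_val_score(clf, X, y, cv=5).mean():.3f}")
```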
Bagging is a device intended for reducing the prediction error of learning algorithms. In its simplest form, bagging draws bootstrap samples from the training sample, applies the learning algorithm to each bootstrap sample, and then averages the resulting prediction rules. Heuristically, the averaging process should reduce the variance component of the prediction error. This is supported by emp...
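The simplest form described here translates directly into code: draw bootstrap samples, fit the base learner to each, and average the predictions. The sketch below is a from-scratch illustration; the helper name bagged_predict and all sizes are invented for the example.

```python
import numpy as np
from sklearn.base import clone
from sklearn.tree import DecisionTreeRegressor

def bagged_predict(base_learner, X_train, y_train, X_test, n_bags=50, seed=0):
    """Average the predictions of base learners fit on bootstrap resamples."""
    rng = np.random.default_rng(seed)
    n = len(X_train)
    preds = []
    for _ in range(n_bags):
        idx = rng.integers(0, n, size=n)            # bootstrap sample, drawn with replacement
        model = clone(base_learner).fit(X_train[idx], y_train[idx])
        preds.append(model.predict(X_test))
    return np.mean(preds, axis=0)                   # averaging targets the variance component

# A high-variance base learner (an unpruned regression tree) benefits most.
rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.3, size=200)
X_test = np.linspace(-3, 3, 100).reshape(-1, 1)
y_hat = bagged_predict(DecisionTreeRegressor(), X, y, X_test)
```

Scikit-learn's BaggingRegressor packages the same loop; the explicit version simply makes the three steps in the description above visible.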
Intuitively, we expect that averaging — or bagging — different regressors with low correlation should smooth their behavior and be somewhat similar to regularization. In this note we make this intuition precise. Using an almost classical definition of stability, we prove that a certain form of averaging provides generalization bounds with a rate of convergence of the same order as Tikhonov regu...