Search results for: boosting
Number of results: 14818
Background and Aim: Genomic selection is a promising challenge for uncovering the genetic basis of quantitative and qualitative traits in order to improve genetic gain and the accuracy of genomic prediction in animal breeding. In this study, the performance of the Boosting and Bayes A methods in estimating genomic breeding values for binary threshold and continuous traits at different marker densities was examined under different genomic architectures. Materials and Methods: Genomic data were simulated with the QMSim software with ...
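The abstract above is truncated, but the task it describes, estimating genomic breeding values from marker genotypes with a boosting model, can be illustrated with a minimal sketch. The snippet below uses scikit-learn's GradientBoostingRegressor on synthetic SNP data generated in NumPy as a stand-in for QMSim output; the data dimensions, QTL count, heritability, and hyperparameters are illustrative assumptions, not values from the study.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_animals, n_snps = 1000, 500
X = rng.integers(0, 3, size=(n_animals, n_snps)).astype(float)  # SNP genotypes coded 0/1/2
true_effects = np.zeros(n_snps)
qtl = rng.choice(n_snps, size=50, replace=False)                # 50 QTL with nonzero effects
true_effects[qtl] = rng.normal(0.0, 0.5, size=50)
tbv = X @ true_effects                                          # true breeding values
y = tbv + rng.normal(0.0, tbv.std(), size=n_animals)            # phenotypes, heritability ~ 0.5

X_tr, X_te, y_tr, y_te, tbv_tr, tbv_te = train_test_split(X, y, tbv, random_state=0)
model = GradientBoostingRegressor(n_estimators=300, learning_rate=0.05, max_depth=3)
model.fit(X_tr, y_tr)
gebv = model.predict(X_te)                                      # genomic estimated breeding values
print("prediction accuracy:", np.corrcoef(gebv, tbv_te)[0, 1])  # correlation with true values
```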
Abstract: The aim of this study was to compare three machine learning methods, random forest, boosting, and support vector machine, in genomic evaluation and to introduce random forest as a powerful method for genotype inference (prediction). The results showed that boosting outperformed the other two methods in most of the scenarios examined, although the differences were significant only in some scenarios (p < 0.05). Moreover, despite the superiority of boosting over the other two methods, the amount of ...
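The comparison this abstract describes can be mocked up with scikit-learn; the sketch below is a rough illustration rather than the study's pipeline, and the synthetic genotype matrix, binary trait definition, and model settings are assumptions made purely for the example.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.integers(0, 3, size=(600, 200)).astype(float)                 # SNP genotypes coded 0/1/2
y = (X[:, :20].sum(axis=1) + rng.normal(0, 2, 600) > 20).astype(int)  # noisy binary trait

models = {
    "random forest": RandomForestClassifier(n_estimators=300, random_state=1),
    "boosting": GradientBoostingClassifier(n_estimators=300, learning_rate=0.05, random_state=1),
    "support vector machine": SVC(kernel="rbf", C=1.0),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)                       # 5-fold cross-validation
    print(f"{name}: mean accuracy = {scores.mean():.3f}")
```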
Smooth boosting algorithms are variants of boosting methods that handle only smooth distributions on the data. They are provably noise-tolerant and can be used in the “boosting by filtering” scheme, which is suitable for learning over huge data sets. However, current smooth boosting algorithms still have room for improvement: among non-smooth boosting algorithms, real AdaBoost or InfoBoost can per...
A procedure for detecting outliers in regression problems is proposed. It is based on information provided by boosting regression trees. The key idea is to select the most frequently resampled observation along the boosting iterations and reiterate after removing it. The selection criterion is based on Tchebychev’s inequality applied to the maximum over the boosting iterations of ...
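As a rough illustration of the idea described here (not the paper's exact procedure, whose selection criterion is based on Tchebychev's inequality), the sketch below counts how often each observation is drawn when boosting resamples proportionally to the current residuals, then flags the most frequently drawn one. The residual-proportional resampling rule, the learning rate, and the planted-outlier toy data are simplifying assumptions.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def resample_counts(X, y, n_iter=100, seed=0):
    """Count how often each observation is drawn when each boosting round
    resamples the data proportionally to the current absolute residuals."""
    rng = np.random.default_rng(seed)
    n = len(y)
    pred = np.zeros(n)
    counts = np.zeros(n, dtype=int)
    for _ in range(n_iter):
        resid = np.abs(y - pred)
        w = resid / resid.sum() if resid.sum() > 0 else np.full(n, 1 / n)
        idx = rng.choice(n, size=n, replace=True, p=w)   # weighted bootstrap sample
        counts += np.bincount(idx, minlength=n)
        tree = DecisionTreeRegressor(max_depth=2).fit(X[idx], y[idx])
        pred += 0.1 * tree.predict(X)                    # small learning rate
    return counts

# toy data with one planted outlier; it should dominate the resampling counts
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(0, 0.3, 200)
y[17] += 15.0
counts = resample_counts(X, y)
print("most frequently resampled observation:", counts.argmax())
```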
A general classification framework, called boosting chain, is proposed for learning a boosting cascade. In this framework, a “chain” structure is introduced to integrate historical knowledge into successive boosting learning. Moreover, a linear optimization scheme is proposed to address the problems of redundancy in boosting learning and of threshold adjustment in cascade coupling. By this means, the...
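A heavily simplified sketch of the cascade idea this snippet describes is given below, assuming scikit-learn's AdaBoostClassifier as the stage learner: each stage is trained on the examples accepted so far, and its decision score is added to a running total carried along the chain. The stage count, thresholds, and acceptance rule are illustrative assumptions, not the paper's linear optimization scheme.

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier

def train_chain(X, y, n_stages=3, seed=0):
    """Train a simplified boosting-chain cascade over binary labels 0/1."""
    stages, thresholds = [], []
    keep = np.ones(len(y), dtype=bool)       # examples still accepted by the cascade
    score = np.zeros(len(y))                 # accumulated score carried along the chain
    for k in range(n_stages):
        if len(np.unique(y[keep])) < 2:      # earlier stages filtered out one class
            break
        clf = AdaBoostClassifier(n_estimators=20, random_state=seed + k)
        clf.fit(X[keep], y[keep])
        score[keep] += clf.decision_function(X[keep])
        thr = score[keep & (y == 1)].min()   # threshold chosen to keep every remaining positive
        stages.append(clf)
        thresholds.append(thr)
        keep &= score >= thr
    return stages, thresholds

def predict_chain(stages, thresholds, X):
    score = np.zeros(len(X))
    accept = np.ones(len(X), dtype=bool)
    for clf, thr in zip(stages, thresholds):
        if not accept.any():
            break
        score[accept] += clf.decision_function(X[accept])
        accept &= score >= thr               # reject as soon as the chained score drops too low
    return accept.astype(int)

# toy usage on overlapping classes
rng = np.random.default_rng(0)
X = rng.normal(size=(400, 5))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 1, 400) > 0).astype(int)
stages, thresholds = train_chain(X, y)
pred = predict_chain(stages, thresholds, X)
print("hit rate:", pred[y == 1].mean(), " false alarm rate:", pred[y == 0].mean())
```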
Boosting is a general method for improving the accuracy of any given learning algorithm. This short overview paper introduces the boosting algorithm AdaBoost, and explains the underlying theory of boosting, including an explanation of why boosting often does not suffer from overfitting as well as boosting’s relationship to support-vector machines. Some examples of recent applications of boostin...
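For readers who want to see the re-weighting scheme the overview explains, here is a minimal sketch of discrete AdaBoost with decision stumps; it is my own illustration using scikit-learn stumps, not code from the cited paper.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost(X, y, n_rounds=50):
    """Discrete AdaBoost with decision stumps; labels y must be coded -1/+1."""
    n = len(y)
    w = np.full(n, 1 / n)                          # start from uniform weights
    stumps, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        err = w[pred != y].sum()
        if err >= 0.5:                             # weak learner no better than chance
            break
        alpha = 0.5 * np.log((1 - err) / max(err, 1e-12))
        w *= np.exp(-alpha * y * pred)             # up-weight the misclassified examples
        w /= w.sum()
        stumps.append(stump)
        alphas.append(alpha)
    return stumps, alphas

def predict(stumps, alphas, X):
    scores = sum(a * s.predict(X) for s, a in zip(stumps, alphas))
    return np.sign(scores)                         # weighted majority vote

# toy usage
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)
stumps, alphas = adaboost(X, y)
print("training accuracy:", (predict(stumps, alphas, X) == y).mean())
```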
Training an ensemble of neural networks is an interesting way to build a Multi-net System. One of the key factors in designing an ensemble is how to combine the networks to give a single output. Although there are several methods for building ensembles, Boosting is one of the most important. Most methods based on Boosting use a specific combiner (the Boosting Combiner). Although the Boosti...
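A minimal sketch of a boosting-style combiner is shown below: each network's class probabilities are weighted by a per-network coefficient and summed. The use of scikit-learn MLPClassifier members, bootstrap training, and the placeholder weights of 1.0 are assumptions made for illustration; in actual Boosting the weights would come from each member's training error (for example, log(1/beta) in AdaBoost).

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

def boosting_combiner(networks, alphas, X):
    """Boosting-style combiner: weighted vote over each member's class probabilities."""
    votes = sum(a * net.predict_proba(X) for net, a in zip(networks, alphas))
    return votes.argmax(axis=1)                    # class with the largest weighted vote

# toy usage: three small networks trained on bootstrap samples, placeholder weights
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 4))
y = (X[:, 0] - X[:, 2] > 0).astype(int)
nets, alphas = [], []
for seed in range(3):
    idx = rng.choice(len(y), size=len(y), replace=True)
    net = MLPClassifier(hidden_layer_sizes=(8,), max_iter=500, random_state=seed)
    nets.append(net.fit(X[idx], y[idx]))
    alphas.append(1.0)                             # stand-in for boosting-derived weights
print("ensemble accuracy:", (boosting_combiner(nets, alphas, X) == y).mean())
```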