Search results for: and boosting

Number of results: 16829190

2009
Zheng Wang, Yangqiu Song, Changshui Zhang

This paper presents an active learning strategy for boosting. In this strategy, we construct a novel objective function to unify semi-supervised learning and active learning boosting. Minimization of this objective is achieved through alternating optimization with respect to the classifier ensemble and the queried data set iteratively. Previous semi-supervised learning or active learning method...
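The snippet above describes alternating between optimizing the classifier ensemble and choosing which examples to query. Below is a loose sketch of that flavor only, not the unified objective proposed in the paper: a plain uncertainty-sampling loop around a boosted ensemble, assuming scikit-learn and a hypothetical `oracle` labeling function.

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier

def active_boosting_loop(X_lab, y_lab, X_unlab, oracle, n_rounds=10, batch=5):
    """oracle is a hypothetical function returning labels for the queried rows."""
    model = None
    for _ in range(n_rounds):
        model = AdaBoostClassifier(n_estimators=50, random_state=0)
        model.fit(X_lab, y_lab)                       # step 1: refit the boosted ensemble
        proba = model.predict_proba(X_unlab)
        uncertainty = 1.0 - proba.max(axis=1)         # least-confident uncertainty score
        query = np.argsort(uncertainty)[-batch:]      # step 2: choose the queried set
        X_lab = np.vstack([X_lab, X_unlab[query]])
        y_lab = np.concatenate([y_lab, oracle(X_unlab[query])])
        X_unlab = np.delete(X_unlab, query, axis=0)   # move queried rows to the labeled pool
    return model
```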

2007
Jin Hyeong Park, Chandan K. Reddy

Boosting is a simple yet powerful modeling technique that is used in many machine learning and data mining related applications. In this paper, we propose a novel scale-space based boosting framework which applies scale-space theory for choosing the optimal regressors during the various iterations of the boosting algorithm. In other words, the data is considered at different resolutions for eac...
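As a rough illustration of "choosing regressors at different resolutions" (not the paper's scale-space framework), the sketch below fits candidate kernel regressors at several bandwidths to the current residuals at each boosting round and keeps the scale that fits best; the candidate scales and learning rate are arbitrary choices for the example.

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge

def multiscale_boost(X, y, scales=(0.1, 1.0, 10.0), n_rounds=20, lr=0.1):
    pred = np.zeros_like(y, dtype=float)
    ensemble = []
    for _ in range(n_rounds):
        residual = y - pred
        # Fit one candidate regressor per "scale" (RBF bandwidth) to the residuals.
        candidates = [KernelRidge(kernel="rbf", gamma=1.0 / (2 * s**2)).fit(X, residual)
                      for s in scales]
        errors = [np.mean((residual - c.predict(X)) ** 2) for c in candidates]
        best = candidates[int(np.argmin(errors))]      # keep the best resolution this round
        pred += lr * best.predict(X)
        ensemble.append(best)
    return ensemble
```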

1998
Zijian Zheng, Geoffrey I. Webb, Kai Ming Ting

Techniques for constructing classifier committees including Boosting and Bagging have demonstrated great success, especially Boosting for decision tree learning. This type of technique generates several classifiers to form a committee by repeated application of a single base learning algorithm. The committee members vote to decide the final classification. Boosting and Bagging create different classi...

Journal: Comput. Sci. Inf. Syst. 2006
Kristína Machova, Miroslav Puszta, Frantisek Barcák, Peter Bednár

In this paper we present an improvement of the precision of classification algorithm results. Two various approaches are known: bagging and boosting. This paper describes a set of experiments with bagging and boosting methods. Our use of these methods aims at classification algorithms generating decision trees. Results of performance tests focused on the use of the bagging and boosting methods ...
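A minimal sketch of the kind of comparison described above, not the paper's experimental setup: bagging and boosting built on decision-tree base learners, evaluated with cross-validation on a synthetic dataset, assuming scikit-learn.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

# Bagging: independent trees on bootstrap samples; Boosting: shallow trees fit sequentially.
bagging = BaggingClassifier(DecisionTreeClassifier(), n_estimators=50, random_state=0)
boosting = AdaBoostClassifier(DecisionTreeClassifier(max_depth=1), n_estimators=50, random_state=0)

for name, model in [("bagging", bagging), ("boosting", boosting)]:
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean accuracy = {scores.mean():.3f}")
```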

2007
Zafer Barutçuoglu, Philip M. Long, Rocco A. Servedio

This paper studies boosting algorithms that make a single pass over a set of base classifiers. We first analyze a one-pass algorithm in the setting of boosting with diverse base classifiers. Our guarantee is the same as the best proved for any boosting algorithm, but our one-pass algorithm is much faster than previous approaches. We next exhibit a random source of examples for which a “picky” v...
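To make the "single pass over a set of base classifiers" idea concrete, here is a schematic sketch (not the algorithm analyzed in the paper): each already-trained base classifier is inspected exactly once, accepted with an AdaBoost-style weight if it beats random guessing on the current example weights, and never revisited. The `min_edge` threshold hints at the "picky" variant, which would demand a larger advantage before accepting.

```python
import numpy as np

def one_pass_boost(base_predictions, y, min_edge=0.0):
    """base_predictions: list of arrays of +/-1 predictions, one per base classifier.
    y: array of +/-1 labels. Returns (alphas, used) for the accepted classifiers."""
    n = len(y)
    w = np.full(n, 1.0 / n)                        # example weights
    alphas, used = [], []
    for idx, h in enumerate(base_predictions):     # single pass: each classifier seen once
        err = np.sum(w * (h != y))
        edge = 0.5 - err
        if edge <= min_edge:                       # skip classifiers with too small an edge
            continue
        alpha = 0.5 * np.log((1.0 - err) / max(err, 1e-12))
        w *= np.exp(-alpha * y * h)                # AdaBoost-style reweighting
        w /= w.sum()
        alphas.append(alpha)
        used.append(idx)
    return alphas, used
```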

Journal: CoRR 2017
Kaidong Wang, Yao Wang, Qian Zhao, Deyu Meng, Zongben Xu

It is known that Boosting can be interpreted as a gradient descent technique to minimize an underlying loss function. Specifically, the underlying loss being minimized by the traditional AdaBoost is the exponential loss, which is proved to be very sensitive to random noise/outliers. Therefore, several Boosting algorithms, e.g., LogitBoost and SavageBoost, have been proposed to improve the robus...
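The point about noise sensitivity can be seen directly from the loss functions: AdaBoost's exponential loss grows without bound on badly misclassified points (large negative margins), whereas the logistic loss used by LogitBoost grows only linearly. A small illustration:

```python
import numpy as np

def exponential_loss(margin):
    # AdaBoost's loss: exp(-y * F(x))
    return np.exp(-margin)

def logistic_loss(margin):
    # LogitBoost's loss: log(1 + exp(-y * F(x)))
    return np.log1p(np.exp(-margin))

margins = np.array([-5.0, -2.0, 0.0, 2.0, 5.0])  # margin = y * F(x)
for m in margins:
    print(f"margin {m:+.1f}: exp loss = {exponential_loss(m):8.2f}, "
          f"logit loss = {logistic_loss(m):6.2f}")
```

A mislabeled example with margin -5 already contributes roughly 30 times more to the exponential loss than to the logistic loss, which is why a few outliers can dominate AdaBoost's reweighting.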

1999
Kai Ming Ting, Zijian Zheng

This paper investigates boosting naive Bayesian classification. It first shows that boosting cannot improve the accuracy of the naive Bayesian classifier on average in a set of natural domains. By analyzing the reasons of boosting's failures, we propose to introduce tree structures into naive Bayesian classification to improve the performance of boosting when working with naive Bayesian classificati...
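For reference, the baseline setup being questioned above can be tried in a few lines (a sketch, not the authors' method or datasets): a plain naive Bayes classifier versus the same model used as the base learner inside AdaBoost, assuming scikit-learn and a synthetic dataset.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

plain_nb = GaussianNB()
# GaussianNB accepts sample weights, which is what AdaBoost needs from its base estimator.
boosted_nb = AdaBoostClassifier(GaussianNB(), n_estimators=50, random_state=0)

for name, model in [("naive Bayes", plain_nb), ("boosted naive Bayes", boosted_nb)]:
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean accuracy = {scores.mean():.3f}")
```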

Journal: Neural Computation 2007
Takafumi Kanamori, Takashi Takenouchi, Shinto Eguchi, Noboru Murata

Boosting is known as a gradient descent algorithm over loss functions. It is often pointed out that the typical boosting algorithm, Adaboost, is highly affected by outliers. In this letter, loss functions for robust boosting are studied. Based on the concept of robust statistics, we propose a transformation of loss functions that makes boosting algorithms robust against extreme outliers. Next, ...
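The exact transformation proposed in the letter is not given in this snippet; as a generic illustration of why transforming a loss helps, the sketch below simply caps the exponential loss at a constant, so no single outlier can contribute more than that cap to the objective. The cap value is an arbitrary choice for the example.

```python
import numpy as np

def exp_loss(margin):
    return np.exp(-margin)

def bounded_loss(margin, cap=5.0):
    # Hypothetical bounded variant: each example's contribution is limited to `cap`.
    return np.minimum(exp_loss(margin), cap)

outlier_margin = -10.0                    # a badly mislabeled example
print(exp_loss(outlier_margin))           # ~22026: this one point dominates the objective
print(bounded_loss(outlier_margin))       # 5.0: its influence is capped
```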

Chart: number of search results per year
