Search results for: paced learning
Number of results: 605450
[1] Bengio, Yoshua, Louradour, Jérôme, Collobert, Ronan, and Weston, Jason. Curriculum learning. In ICML, 2009. [2] Kumar, M. Pawan, Packer, Benjamin, and Koller, Daphne. Self-paced learning for latent variable models. In NIPS, 2010. [3] Shrivastava, Abhinav, Gupta, Abhinav, and Girshick, Ross. Training region-based object detectors with online hard example mining. In CVPR, 2016. [4] Avramova,...
Mixture of regressions (MoR) is a well-established and effective approach to modeling discontinuous and heterogeneous data in regression problems. Existing MoR approaches assume a smooth joint distribution for its good analytic properties. However, such an assumption makes existing MoR very sensitive to intra-component outliers (the noisy training data residing in certain components) and the inter-com...
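To make the smoothness assumption above concrete, the following is a minimal EM sketch for a standard (non-robust) mixture of linear regressions with Gaussian components; the updates are the textbook ones, and the function name, regularizer, and toy parameters are illustrative assumptions rather than the formulation of the work summarized here. The Gaussian noise model is exactly what lets a single large-residual outlier dominate a component's fit.

import numpy as np

def mixture_of_regressions_em(X, y, K=2, n_iter=50, seed=0):
    # Minimal EM for a standard Gaussian mixture of linear regressions
    # (hypothetical helper; not the robust variant discussed above).
    rng = np.random.default_rng(seed)
    n, d = X.shape
    beta = rng.normal(size=(K, d))      # per-component regression weights
    sigma2 = np.ones(K)                 # per-component noise variances
    pi = np.full(K, 1.0 / K)            # mixing proportions
    for _ in range(n_iter):
        # E-step: responsibility of component k for sample i
        resid = y[:, None] - X @ beta.T                      # shape (n, K)
        log_prob = (np.log(pi) - 0.5 * np.log(2 * np.pi * sigma2)
                    - 0.5 * resid ** 2 / sigma2)
        log_prob -= log_prob.max(axis=1, keepdims=True)
        gamma = np.exp(log_prob)
        gamma /= gamma.sum(axis=1, keepdims=True)
        # M-step: weighted least squares and variance update per component
        for k in range(K):
            w = gamma[:, k]
            Xw = X * w[:, None]
            beta[k] = np.linalg.solve(Xw.T @ X + 1e-8 * np.eye(d), Xw.T @ y)
            sigma2[k] = max(float((w * (y - X @ beta[k]) ** 2).sum() / w.sum()), 1e-6)
        pi = gamma.mean(axis=0)
    return beta, sigma2, pi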
We propose a scalable approach to learning video-based question answering (QA): answering a free-form natural language question about the contents of a video. Our approach automatically harvests a large number of videos and descriptions freely available online. Then, a large number of candidate QA pairs are automatically generated from the descriptions rather than manually annotated. Next, we use thes...
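As a rough, purely hypothetical illustration of how a declarative description can yield a candidate QA pair without manual annotation, the caption_to_qa helper below masks one content word of a caption to form a fill-in-the-blank question; the actual generation procedure used in the work summarized above is not reproduced here.

from typing import Optional, Tuple

STOPWORDS = {"a", "an", "the", "is", "are", "was", "were", "in", "on", "at",
             "of", "to", "and", "with", "his", "her", "their"}

def caption_to_qa(caption: str) -> Optional[Tuple[str, str]]:
    # Mask the last non-stopword token of the caption to form a
    # (fill-in-the-blank question, answer) pair.
    tokens = caption.strip().rstrip(".").split()
    for i in range(len(tokens) - 1, -1, -1):
        if tokens[i].lower() not in STOPWORDS:
            answer = tokens[i]
            question = " ".join(tokens[:i] + ["____"] + tokens[i + 1:]) + "?"
            return question, answer
    return None

print(caption_to_qa("A man is playing a guitar on the street"))
# -> ('A man is playing a guitar on the ____?', 'street')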
The background to this discussion of work-integrated learning is the three streams of undergraduate Built Environment programs at Central Queensland University that are accredited by their relevant industries. CQU students' genuinely work-integrated learning experience may be considered a 'self-paced, flexible learning while earning' process. Relevant background theories of philosop...
Self-paced learning (SPL) is a recently proposed methodology that simulates the learning principle of humans/animals: start with the easier aspects of a learning task, and then gradually take more complex examples into training. This learning regime has been empirically shown to be effective in various computer vision and pattern recognition tasks. Recently, it has been proved that the S...
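The alternating scheme behind SPL can be sketched as follows. The hard sample weights v_i in {0, 1}, the age parameter lam, and its growth factor mu follow the commonly used hard-weighting formulation; the toy linear-regression objective and the function name are illustrative assumptions, not the specific model of the work summarized above.

import numpy as np

def self_paced_linear_regression(X, y, lam=1.0, mu=1.3, n_rounds=10):
    # Alternate between (1) fitting the model on the currently selected
    # "easy" samples and (2) re-selecting samples whose loss falls below
    # the age parameter lam, which grows by the factor mu each round.
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_rounds):
        losses = (X @ w - y) ** 2                 # per-sample squared losses
        v = (losses < lam).astype(float)          # hard SPL weights: 1 = selected
        if v.sum() == 0:
            v[np.argmin(losses)] = 1.0            # bootstrap with the easiest sample
        w, *_ = np.linalg.lstsq(X[v > 0], y[v > 0], rcond=None)
        lam *= mu                                 # let harder samples enter later
    return w

# Toy usage: clean points plus one outlier that is admitted only in late rounds.
rng = np.random.default_rng(0)
X = np.hstack([rng.normal(size=(20, 1)), np.ones((20, 1))])
y = X @ np.array([2.0, -1.0]) + 0.05 * rng.normal(size=20)
y[0] += 10.0
print(self_paced_linear_regression(X, y))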
It is known that Boosting can be interpreted as a gradient descent technique to minimize an underlying loss function. Specifically, the loss being minimized by traditional AdaBoost is the exponential loss, which has proved very sensitive to random noise/outliers. Therefore, several algorithms, e.g., LogitBoost and SavageBoost, have been proposed to improve robustness by replacing the exponential loss with specially designed robust loss functions....
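The contrast between the exponential loss and its bounded alternatives can be seen numerically. The loss forms below are the ones commonly associated with AdaBoost-, LogitBoost-, and SavageBoost-style objectives; the exact scalings should be treated as assumptions rather than the papers' verbatim definitions.

import numpy as np

def exponential_loss(m):   # AdaBoost: unbounded, grows fast for negative margins
    return np.exp(-m)

def logistic_loss(m):      # LogitBoost-style: grows only linearly for negative margins
    return np.log1p(np.exp(-m))

def savage_loss(m):        # SavageBoost-style: bounded, hence robust to outliers
    return 1.0 / (1.0 + np.exp(2.0 * m)) ** 2

# A badly misclassified point (large negative margin m = y * f(x), e.g. a
# mislabeled outlier) dominates the exponential loss but barely moves the
# bounded Savage loss.
for m in [-4.0, -1.0, 0.0, 1.0]:
    print(f"m={m:+.1f}  exp={exponential_loss(m):8.2f}  "
          f"logit={logistic_loss(m):6.3f}  savage={savage_loss(m):6.4f}")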
In this paper, we address a special scenario of semi-supervised learning, where the missing labels are caused by a preceding filtering mechanism, i.e., an instance can enter the subsequent process, in which its label is revealed, if and only if it passes the mechanism. The rejected instances are prohibited from labeling due to economical or ethical reasons, making the supports of the labeled and unlabeled distributions isolated from each ot...