Search results for: loss minimization
Number of results: 475,131
An electric distribution system plays an important role in achieving a satisfactory power supply. Power quality is measured by voltage stability and the voltage profile, but losses in the distribution system degrade its voltage profile. Basically, the losses can be defined as the difference between the metered units of energy input into the distribution system and the total energy pa...
Learning under a Wasserstein loss, a.k.a. Wasserstein loss minimization (WLM), is an emerging research topic for gaining insights from a large set of structured objects. Despite being conceptually simple, WLM problems are computationally challenging because they involve minimizing over functions of quantities (i.e. Wasserstein distances) that themselves require numerical algorithms to compute. ...
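The computational burden the abstract mentions comes from the fact that each Wasserstein distance is itself the solution of an optimization problem. A common numerical workhorse is the entropically regularized Sinkhorn iteration; the following is a minimal sketch (the function name, toy histograms, and regularization setting are illustrative, not taken from the indexed paper):

```python
import numpy as np

def sinkhorn_distance(a, b, C, eps=0.1, iters=200):
    """Entropic-regularized Wasserstein distance between histograms a, b
    with ground-cost matrix C, computed by alternating Sinkhorn scalings."""
    K = np.exp(-C / eps)                  # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(iters):
        v = b / (K.T @ u)                 # match column marginals
        u = a / (K @ v)                   # match row marginals
    P = u[:, None] * K * v[None, :]       # approximate transport plan
    return float(np.sum(P * C))

# toy example: two 3-bin histograms on a line, cost = |i - j|
a = np.array([0.5, 0.5, 0.0])
b = np.array([0.0, 0.5, 0.5])
C = np.abs(np.arange(3)[:, None] - np.arange(3)[None, :]).astype(float)
d = sinkhorn_distance(a, b, C)            # close to the exact W1 distance of 1.0
```

Because this inner solver sits inside an outer learning loop, WLM effectively nests one numerical algorithm inside another, which is exactly the difficulty the abstract points to.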
One of the most active areas of research in supervised learning has been the study of methods for constructing good ensembles of classifiers, that is, a set of classifiers whose individual decisions are combined to increase overall accuracy of classifying new examples. In many applications classifiers are required to minimize an asymmetric loss function rather than the raw misclassification rate. ...
We propose a boosting method, DirectBoost, a greedy coordinate descent algorithm that builds an ensemble classifier of weak classifiers through directly minimizing empirical classification error over labeled training examples; once the training classification error is reduced to a local coordinatewise minimum, DirectBoost runs a greedy coordinate ascent algorithm that continuously adds weak cla...
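The "directly minimizing empirical classification error" idea can be sketched as a greedy search over (weak learner, weight) pairs that most reduce the training 0-1 error. This is a hypothetical simplification for illustration only, not the published DirectBoost algorithm (which also includes the coordinate-ascent margin phase the abstract describes):

```python
import numpy as np

def directboost_sketch(X, y, stumps, rounds=10, alphas=(0.25, 0.5, 1.0)):
    """Greedy coordinate descent on the empirical 0-1 loss: each round adds
    the (stump, weight) pair giving the lowest training error. Illustrative
    sketch only; stumps map X -> {-1, +1} and y is in {-1, +1}."""
    F = np.zeros(len(y))                          # ensemble score per example
    for _ in range(rounds):
        best = None
        for h in stumps:
            pred = h(X)
            for a in alphas:
                err = np.mean(np.sign(F + a * pred) != y)
                if best is None or err < best[0]:
                    best = (err, a, pred)
        _, a, pred = best
        F = F + a * pred                          # greedy coordinate step
    return F
```

Note that the 0-1 error is piecewise constant, so this search evaluates candidate steps explicitly instead of following a gradient, which is why the paper's coordinate-wise treatment matters.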
Uniform sampling of training data has been commonly used in traditional stochastic optimization algorithms such as Proximal Stochastic Mirror Descent (prox-SMD) and Proximal Stochastic Dual Coordinate Ascent (prox-SDCA). Although uniform sampling can guarantee that the sampled stochastic quantity is an unbiased estimate of the corresponding true quantity, the resulting estimator may have a rath...
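The unbiasedness-versus-variance trade-off the abstract raises can be shown in a few lines: sampling example i with probability p_i and reweighting by 1/(n·p_i) keeps the estimator unbiased, and choosing p_i proportional to the magnitude of the per-example quantity reduces its variance (here, for a scalar toy, to zero). Values and sample sizes below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
vals = np.array([10.0, 1.0, 1.0, 1.0])     # per-example quantities (e.g. gradient norms)
n = len(vals)

# uniform sampling: unbiased, but variance driven by the large outlier
uni_idx = rng.choice(n, size=100_000)
uni_est = np.mean(vals[uni_idx])

# importance sampling proportional to magnitude, with 1/(n p_i) reweighting
p = vals / vals.sum()
idx = rng.choice(n, size=100_000, p=p)
imp_est = np.mean(vals[idx] / (n * p[idx]))  # unbiased: E[v_i / (n p_i)] = mean(v)
```

For this scalar example every reweighted sample equals `vals.sum() / n` exactly, so the importance-sampled estimator has zero variance, which is the intuition behind non-uniform sampling variants of prox-SMD and prox-SDCA.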
Sleep loss is a ubiquitous phenomenon that occurs on many long-term field missions. The effects of sleep loss are, in general, detrimental to the efficient functioning of man-machine systems. To illustrate the effect of sleep loss on task performance, data from four independent research institutes are reviewed. Data are presented relating to the prevention of sleep loss, and to the detection and ...
In this paper, we study the problem of learning a metric and propose a loss-function-based metric learning framework, in which the metric is estimated by minimizing an empirical risk over a training set. With mild conditions on the instance distribution and the loss function used, we prove that the empirical risk converges to its expected counterpart at a root-n rate. In addition, with the ass...
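A minimal sketch of what "estimating the metric by minimizing an empirical risk" can look like, assuming a diagonal Mahalanobis metric and a margin loss over labeled pairs (this setup is hypothetical, not the paper's actual framework):

```python
import numpy as np

def learn_diag_metric(X, pairs, labels, lr=0.05, epochs=200):
    """Learn a diagonal Mahalanobis metric d_w(x, x') = sum_k w_k (x_k - x'_k)^2
    by subgradient descent on the empirical margin risk over labeled pairs:
    label +1 (similar) wants d < 1, label -1 (dissimilar) wants d > 1."""
    w = np.ones(X.shape[1])
    for _ in range(epochs):
        g = np.zeros_like(w)
        for (i, j), s in zip(pairs, labels):
            d2 = (X[i] - X[j]) ** 2
            if s * (1.0 - d2 @ w) < 0:            # hinge loss max(0, -s(1 - d)) active
                g += s * d2                        # its subgradient w.r.t. w
        w = np.maximum(w - lr * g / len(pairs), 0.0)  # project onto w >= 0
    return w

# toy data: feature 0 separates classes, feature 1 is nuisance variation
X = np.array([[0., 0.], [0., 2.], [2., 0.], [2., 2.]])
pairs = [(0, 1), (2, 3), (0, 2), (1, 3)]
labels = [1, 1, -1, -1]
w = learn_diag_metric(X, pairs, labels)           # downweights the nuisance feature
```

The empirical risk minimized here is exactly an average of per-pair losses, which is the quantity whose root-n convergence to the expected risk the abstract establishes.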
In recent years there has been a growing interest in quantification, a variant of classification in which the final goal is not accurately classifying each unlabelled document but accurately estimating the prevalence (or “relative frequency”) of each class c in the unlabelled set. Quantification has several applications in information retrieval, data mining, machine learning, and natural langua...
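A standard quantification baseline (not necessarily the indexed paper's method) makes the goal concrete: instead of trusting the raw fraction of documents a classifier labels positive, "adjusted classify and count" corrects it with the classifier's known true- and false-positive rates:

```python
import numpy as np

def adjusted_classify_and_count(preds, tpr, fpr):
    """Estimate class prevalence from hard classifier outputs (0/1 array).
    Corrects the raw positive rate using E[raw] = prev*tpr + (1-prev)*fpr."""
    raw = np.mean(preds)
    return float(np.clip((raw - fpr) / (tpr - fpr), 0.0, 1.0))

# toy example: classifier flags 50% positive, with tpr = 0.9 and fpr = 0.1
preds = np.array([1, 0] * 500)
prevalence = adjusted_classify_and_count(preds, tpr=0.9, fpr=0.1)  # -> 0.5
```

The correction matters precisely because a classifier tuned for per-document accuracy can still be systematically biased as a prevalence estimator, which is the gap between classification and quantification the abstract describes.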
In this article, we explore the concept of minimization of information loss (MIL) as a target for neural network learning. We relate MIL to supervised and unsupervised learning procedures such as the Bayesian maximum a posteriori (MAP) discriminator, minimization of distortion measures such as mean squared error (MSE) and cross-entropy (CE), and principal component analysis (PCA). To deal wit...
Supervised training of deep neural nets typically relies on minimizing cross-entropy. However, in many domains, we are interested in performing well on metrics specific to the application. In this paper we propose a direct loss minimization approach to train deep neural networks, which provably minimizes the application-specific loss function. This is often non-trivial, since these functions ar...
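The non-trivial part is that application-specific metrics are typically non-decomposable or non-differentiable. One family of approaches, loss-augmented inference with a finite-difference ("direct loss") gradient, can be sketched for a per-class score vector as follows. This is a heavily simplified, hypothetical illustration of the general idea, not the paper's exact algorithm:

```python
import numpy as np

def direct_loss_gradient(scores, y_true, task_loss, eps=1.0):
    """Finite-difference 'direct loss' gradient w.r.t. a score vector:
    compare the standard argmax with a loss-augmented argmax and take the
    difference of their one-hot indicators, scaled by 1/eps."""
    n = len(scores)
    loss_vec = np.array([task_loss(y, y_true) for y in range(n)])
    y_hat = int(np.argmax(scores))                    # standard prediction
    y_aug = int(np.argmax(scores + eps * loss_vec))   # loss-augmented inference
    g = np.zeros(n)
    g[y_aug] += 1.0 / eps                             # ascent direction on task loss
    g[y_hat] -= 1.0 / eps
    return g                                          # descend with scores -= lr * g
```

When a high-loss label comes close to winning the argmax, the gradient pushes the scores away from it; when the margin is comfortable, the two argmaxes coincide and the gradient vanishes. Backpropagating such a signal through the network is what lets training target the application-specific loss directly.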