Search results for: loss minimization
Number of results: 475,131. Filter results by year:
In this paper, we theoretically study the problem of binary classification in the presence of random classification noise: the learner, instead of seeing the true labels, sees labels that have each been independently flipped with some small probability. Moreover, the random label noise is class-conditional, meaning the flip probability depends on the class. We provide two approaches to suitably modify any giv...
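One of the modifications this line of work proposes (the unbiased-estimator construction of Natarajan et al.) can be sketched as follows; the base hinge loss and the flip rates `rho_pos`, `rho_neg` are illustrative assumptions, not values from the abstract:

```python
def unbiased_loss(loss, rho_pos, rho_neg):
    """Wrap a base loss l(t, y) so that its expectation under
    class-conditional label flipping equals the clean-label loss.
    rho_pos = P(flip | y = +1), rho_neg = P(flip | y = -1),
    assumed to satisfy rho_pos + rho_neg < 1."""
    denom = 1.0 - rho_pos - rho_neg
    def corrected(t, y):
        # Down-weight the loss on the observed label and subtract
        # the loss on the opposite label, so the noise cancels in
        # expectation.
        r_same = rho_neg if y == 1 else rho_pos
        r_opp = rho_pos if y == 1 else rho_neg
        return ((1.0 - r_same) * loss(t, y) - r_opp * loss(t, -y)) / denom
    return corrected

hinge = lambda t, y: max(0.0, 1.0 - y * t)
corr = unbiased_loss(hinge, rho_pos=0.2, rho_neg=0.1)
```

With zero noise rates the wrapper reduces exactly to the base loss, which is a quick sanity check on the construction.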
We extend the well-known BFGS quasi-Newton method and its limited-memory variant (LBFGS) to the optimization of nonsmooth convex objectives. This is done in a rigorous fashion by generalizing three components of BFGS to subdifferentials: The local quadratic model, the identification of a descent direction, and the Wolfe line search conditions. We apply the resulting subLBFGS algorithm to L2-reg...
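The sub(L)BFGS machinery itself is beyond a snippet, but the nonsmoothness it must handle is easy to see in the L2-regularized hinge risk the abstract applies it to. A minimal sketch (not the paper's algorithm) of computing one element of the subdifferential, with the convention of picking 0 at the hinge's kink:

```python
import numpy as np

def hinge_risk_subgradient(w, X, y, lam):
    """One subgradient of
        f(w) = lam/2 ||w||^2 + (1/m) sum_i max(0, 1 - y_i <x_i, w>).
    At a kink (margin exactly 1) any multiple in [-1, 0] of y_i x_i
    is valid; choosing 0 there still yields a valid subgradient."""
    margins = y * (X @ w)
    active = margins < 1.0  # examples with strictly positive hinge loss
    g = lam * w - (y[active, None] * X[active]).sum(axis=0) / len(y)
    return g
```

Generalizing the quadratic model and line search to operate on such subgradients, rather than gradients, is exactly the extension the abstract describes.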
The minimization of the logistic loss is a popular approach to batch supervised learning. Our paper starts from the surprising observation that, when fitting linear (or kernelized) classifiers, the minimization of the logistic loss is equivalent to the minimization of an exponential rado-loss computed (i) over transformed data that we call Rademacher observations (rados), and (ii) over the same...
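The central object here is the Rademacher observation. A sketch of computing one, assuming the standard definition pi_sigma = (1/2) * sum_i (sigma_i + y_i) x_i for a sign vector sigma in {-1, +1}^m (the particular arrays below are made-up illustrations):

```python
import numpy as np

def rado(X, y, sigma):
    """Rademacher observation for sign vector sigma in {-1, +1}^m:
        pi_sigma = (1/2) * sum_i (sigma_i + y_i) * x_i.
    Since (sigma_i + y_i)/2 is y_i when sigma_i == y_i and 0 otherwise,
    only the examples where sigma agrees with the label contribute."""
    return (0.5 * (sigma + y)[:, None] * X).sum(axis=0)

X = np.array([[1.0, 2.0], [3.0, 4.0]])
y = np.array([1.0, -1.0])
```

Note that sigma = y recovers sum_i y_i x_i, the (unnormalized) mean operator, which is the sense in which rados summarize the labeled sample.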
به منظور بررسی تاثیر پروبیوتیک و سطوح مختلف پودر صمغ آنغوزه در مقایسه با آنتی بیوتیک (آویلامایسین) بر عملکرد رشد، وزن نسبی اندام های داخلی، میکروفلور روده، مورفولوژی پرزهای روده، کیفیت گوشت و تیترآنتی بادی بر علیه گلبول قرمزگوسفندی در جوجه های گوشتی، آزمایشی در قالب طرح کاملا تصادفی با 6 تیمار به اجرا درآمد. تیمارهای آزمایشی شامل: جیره پایه بدون افزودنی، جیره پایه حاوی 100 میلی گرم در کیلو گرم ...
In the present deregulated environment, optimal placement of Distributed Generation (DG) and shunt capacitors in the distribution network plays a vital role in distribution system planning. In this paper, an analytical approach for optimal placement of combined DG and capacitor units is presented, with the objective of power loss reduction and voltage profile improvement. Firstly, the DG unit i...
Convex loss minimization with l1 regularization has been proposed as a promising method for feature selection in classification (e.g., l1-regularized logistic regression) and regression (e.g., l1-regularized least squares). In this paper we describe an efficient interior-point method for solving large-scale l1-regularized convex loss minimization problems that uses a preconditioned conjugate gr...
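The interior-point solver itself is involved; as a minimal baseline for the same l1-regularized least-squares problem, proximal gradient descent (ISTA) with soft-thresholding illustrates why the l1 penalty drives coefficients exactly to zero, i.e. selects features:

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1: shrink toward zero by t."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(X, y, lam, n_iter=500):
    """Minimize (1/2)||Xw - y||^2 + lam * ||w||_1 by proximal gradient."""
    step = 1.0 / np.linalg.norm(X, 2) ** 2  # 1/L, L = Lipschitz const of grad
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        w = soft_threshold(w - step * X.T @ (X @ w - y), step * lam)
    return w
```

For orthonormal X the solution is soft_threshold(X.T @ y, lam) in closed form, so any coefficient with |X.T @ y| below lam is zeroed out, which is the feature-selection effect the abstract refers to.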
Distributed generations (DGs) are utilized to supply the active and reactive power in the transmission and distribution systems. These types of power sources have many benefits such as power quality enhancement, voltage deviation reduction, power loss reduction, load shedding reduction, reliability improvement, etc. In order to reach the above benefits, the optimal placement and sizing of DG is...
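Both of the DG-placement abstracts above turn on power-loss reduction. A toy brute-force search over candidate buses of a radial feeder shows the shape of the problem; the constant-current model, resistances, load currents, and DG size below are all made-up illustrative values, not data from either paper:

```python
# Toy radial feeder: bus 0 is the substation; branch k connects bus k to k+1.
r = [0.1, 0.1, 0.1]           # branch resistances (ohm), assumed values
load = [0.0, 1.0, 1.0, 1.0]   # per-bus load currents (A), assumed values

def feeder_loss(dg_bus, dg_amps):
    """Total I^2 R loss when a DG injecting dg_amps is placed at dg_bus."""
    loss = 0.0
    for k in range(len(r)):
        # Branch k carries the net demand of every bus beyond bus k;
        # a DG downstream of the branch (dg_bus > k) offsets that demand.
        i = sum(load[k + 1:]) - (dg_amps if dg_bus > k else 0.0)
        loss += r[k] * i * i
    return loss

# Exhaustively pick the bus that minimizes losses for a fixed DG size.
best = min(range(1, len(load)), key=lambda b: feeder_loss(b, 1.5))
```

Real placement methods replace this toy loss expression with a power-flow solution and add voltage-profile constraints, but the outer structure, a search over candidate locations scored by loss, is the same.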
Multivariate loss functions are used to assess performance in many modern prediction tasks, including information retrieval and ranking applications. Convex approximations are typically optimized in their place to avoid NP-hard empirical risk minimization problems. We propose to approximate the training data instead of the loss function by posing multivariate prediction as an adversarial game b...
A new procedure for learning cost-sensitive SVM classifiers is proposed. The SVM hinge loss is extended to the cost-sensitive setting, and the cost-sensitive SVM is derived as the minimizer of the associated risk. The extension of the hinge loss draws on recent connections between risk minimization and probability elicitation. These connections are generalized to cost-sensitive classification, i...
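As a minimal numerical illustration of attaching asymmetric costs to the hinge loss (the paper's derived extension also rescales the margins, not just the slopes, so this is not the paper's exact loss; the cost values are assumptions):

```python
def cs_hinge(f, y, c_fn=2.0, c_fp=1.0):
    """Asymmetric hinge on score f with label y in {-1, +1}:
    a margin violation on a positive (potential false negative)
    is charged c_fn, on a negative c_fp. Illustrative form only."""
    if y == 1:
        return c_fn * max(0.0, 1.0 - f)
    return c_fp * max(0.0, 1.0 + f)
```

Minimizing a risk built from such a loss shifts the learned decision boundary toward the cheaper error type, which is the behavior cost-sensitive SVM training formalizes.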