Search results for: quadratic loss function

Number of results: 1,596,512

2002
Alexander I. Barvinok, Tamon Stephen

We obtain a number of results regarding the distribution of values of a quadratic function f on the set of n × n permutation matrices (identified with the symmetric group Sn) around its optimum (minimum or maximum). In particular, we estimate the fraction of permutations σ such that f(σ) lies within a given neighborhood of the optimal value of f . We identify some “extreme” functions f (there a...
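A minimal sketch of the setting, assuming a quadratic assignment-style objective f(σ) = Σ_{i,j} A[i][j]·B[σ(i)][σ(j)]: for small n one can enumerate Sn directly and measure what fraction of permutations fall within a given neighborhood of the optimum. The matrices and the neighborhood radius below are illustrative, not taken from the paper.

```python
"""Estimate the fraction of permutations whose value of a QAP-style quadratic
function f(sigma) = sum_ij A[i][j] * B[sigma(i)][sigma(j)] lies within a
relative neighborhood of the minimum.  A, B and eps are illustrative."""
from itertools import permutations
import numpy as np

rng = np.random.default_rng(0)
n = 6
A = rng.random((n, n))
B = rng.random((n, n))

def f(sigma):
    # Quadratic objective evaluated at the permutation sigma (a tuple of indices).
    return sum(A[i, j] * B[sigma[i], sigma[j]] for i in range(n) for j in range(n))

values = np.array([f(s) for s in permutations(range(n))])
f_min, f_max = values.min(), values.max()
eps = 0.05  # neighborhood: within 5% of the value range above the minimum
fraction = np.mean(values <= f_min + eps * (f_max - f_min))
print(f"min={f_min:.3f}, max={f_max:.3f}, fraction near optimum={fraction:.4f}")
```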

2011
Eugenio Mijangos

An algorithm for solving quadratic, two-stage stochastic problems is developed. The algorithm is based on the framework of the Branch and Fix Coordination (BFC) method. These problems have continuous and binary variables in the first stage and only continuous variables in the second one. The objective function is quadratic and the constraints are linear. The nonanticipativity constraints are fu...
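A hedged sketch of the problem class described above, written as a generic two-stage stochastic quadratic program with mixed binary/continuous first stage and continuous recourse; the symbols are illustrative and not taken from the paper, and the nonanticipativity constraints appear implicitly through the single first-stage decision.

```latex
% Generic two-stage stochastic QP: x continuous and y binary in the first stage,
% continuous recourse z_s for each scenario s with probability p_s.
\min_{x,\,y,\,z_s}\;
  \tfrac{1}{2} x^{\top} Q_0 x + c^{\top} x + d^{\top} y
  + \sum_{s=1}^{S} p_s \left( \tfrac{1}{2} z_s^{\top} Q_s z_s + q_s^{\top} z_s \right)
\quad \text{s.t.}\quad
  A x + B y \le b,\qquad
  T_s x + W_s z_s \le h_s \;\; (s=1,\dots,S),\qquad
  y \in \{0,1\}^{m}.
```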

2014
Roger Fletcher, Nerijus Galiauskas, Julius Žilinskas

In this paper, we consider an optimization problem arising in multidimensional scaling with city-block distances. The objective function of this problem has many local minimum points and may even be non-differentiable at a minimum point. We reformulate the problem as one with a convex quadratic objective function and linear and complementarity constraints. In addition, we propose an algorith...
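For concreteness, a common least-squares (stress) objective for multidimensional scaling with city-block distances, which the abstract's reformulation starts from; the exact weighting scheme is an assumption. The absolute values are what make the objective non-differentiable at some minimizers.

```latex
% MDS stress with city-block (l1) distances between embedded points x_i in R^m;
% delta_ij are the given dissimilarities and w_ij are optional nonnegative weights.
S(x_1,\dots,x_n) \;=\; \sum_{i<j} w_{ij}
  \Bigl( \delta_{ij} - \sum_{k=1}^{m} \lvert x_{ik} - x_{jk} \rvert \Bigr)^{2}.
```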

2011
Enno Mammen, Christoph Rothe, Melanie Schienle

Semiparametric Estimation with Generated Covariates. In this paper, we study a general class of semiparametric optimization estimators of a vector-valued parameter. The criterion function depends on two types of infinite-dimensional nuisance parameters: a conditional expectation function that has been estimated nonparametrically using generated covariates, and another estimated function that is u...
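A toy illustration of the "generated covariates" idea: a first-stage nonparametric fit produces an estimated conditional expectation, which then enters a second-stage criterion as a regressor. The data-generating process, kernel, and bandwidth below are assumptions, and this is only a sketch of the two-step structure, not the paper's estimator.

```python
"""Two-step estimation with a generated covariate: a first-stage
Nadaraya-Watson fit of E[V | Z] is plugged into a second-stage least-squares
criterion.  The simulated model and bandwidth h are illustrative assumptions."""
import numpy as np

rng = np.random.default_rng(1)
n = 500
z = rng.uniform(-2, 2, n)
v = np.sin(z) + 0.2 * rng.normal(size=n)                    # first-stage outcome
x = rng.normal(size=n)
y = 1.5 * x + 2.0 * np.sin(z) + 0.3 * rng.normal(size=n)    # depends on E[V | Z]

# First stage: kernel regression estimate of E[V | Z] (the generated covariate).
h = 0.3
def nw(z0):
    w = np.exp(-0.5 * ((z - z0) / h) ** 2)
    return np.sum(w * v) / np.sum(w)

v_hat = np.array([nw(z0) for z0 in z])

# Second stage: least squares using the generated covariate v_hat as a regressor.
X = np.column_stack([x, v_hat])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print("second-stage estimates:", beta)  # roughly [1.5, 2.0] in this toy setup
```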

2001
Wei Chu, S. Sathiya Keerthi, Chong Jin Ong

In this paper, we propose a unified non-quadratic loss function for regression known as the soft insensitive loss function (SILF). SILF is a flexible model and possesses most of the desirable characteristics of popular non-quadratic loss functions, such as the Laplacian, Huber's, and Vapnik's ε-insensitive loss functions. We describe the properties of SILF and illustrate our assumption on the underlying ...
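A hedged sketch of a soft-insensitive-style loss: zero inside an insensitive zone, quadratic in a transition band, and linear in the tails, so that it interpolates between the ε-insensitive, Huber, and Laplacian losses. The exact parameterization used by SILF should be taken from the paper; the one below, with parameters eps and beta, is an assumption.

```python
"""A piecewise loss with an insensitive zone, a quadratic transition band, and
linear tails, in the spirit of the soft insensitive loss function (SILF).
The (eps, beta) parameterization below is an illustrative assumption."""
import numpy as np

def soft_insensitive_loss(delta, eps=0.5, beta=0.5):
    """delta: residual y - f(x); eps > 0 is the insensitivity width;
    0 < beta <= 1 controls the width of the quadratic smoothing band."""
    a = np.abs(delta)
    lo, hi = (1 - beta) * eps, (1 + beta) * eps
    return np.where(a < lo, 0.0,                                  # insensitive zone
           np.where(a <= hi, (a - lo) ** 2 / (4 * beta * eps),    # quadratic band
                    a - eps))                                     # linear tail

residuals = np.linspace(-3, 3, 7)
print(soft_insensitive_loss(residuals))
```

The quadratic band and linear tail meet continuously at |δ| = (1+β)ε, and the limits β → 0 and β → 1 recover the ε-insensitive and Huber-like behaviors, respectively.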

2011
Hsin-Hung Chen, Hsien-Tang Tsai, Dennis K. J. Lin

Fund managers place a high priority on selecting portfolios with a high Sharpe ratio. Traditionally, this task is achieved by revising the objective function of the Markowitz mean-variance portfolio model and then solving quadratic programming problems to obtain the maximum Sharpe ratio portfolio. This study presents a closed-form solution for the optimal Sharpe ratio portfolio by applying Cauch...
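For reference, a minimal sketch of the standard closed-form tangency (maximum Sharpe ratio) portfolio under a budget constraint only, w ∝ Σ⁻¹(μ − r_f·1); whether this coincides with the paper's formula under its exact constraints is an assumption, and the inputs below are illustrative.

```python
"""Closed-form maximum Sharpe ratio (tangency) portfolio under a budget
constraint only: w is proportional to inv(Sigma) @ (mu - rf).  Sample inputs
are illustrative."""
import numpy as np

mu = np.array([0.08, 0.12, 0.10])           # expected returns
Sigma = np.array([[0.04, 0.01, 0.00],
                  [0.01, 0.09, 0.02],
                  [0.00, 0.02, 0.16]])       # return covariance matrix
rf = 0.02                                    # risk-free rate

w = np.linalg.solve(Sigma, mu - rf)          # unnormalized tangency direction
w /= w.sum()                                 # normalize weights to sum to one

sharpe = (w @ mu - rf) / np.sqrt(w @ Sigma @ w)
print("weights:", np.round(w, 4), " Sharpe ratio:", round(sharpe, 4))
```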

2014
Matthias H. Y. Tan

In robust parameter design, the quadratic loss function is commonly used. However, this loss function is not always realistic and the expected loss may not exist in some cases. This paper proposes the use of a general class of bounded loss functions that are cumulative distribution functions and probability density functions. New loss functions are investigated and the loss functions are shown ...
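As a concrete illustration of a bounded loss built from a probability density, one option is the Gaussian-shaped loss 1 − exp(−(y − t)²/(2γ²)), which stays in [0, 1) and therefore always has a finite expectation, unlike the quadratic loss (y − t)². This specific form is an assumption for illustration, not necessarily one of the paper's proposed losses.

```python
"""Compare the unbounded quadratic loss with a bounded, Gaussian-shaped loss
1 - exp(-(y - t)^2 / (2 * gamma^2)); target t and scale gamma are assumptions."""
import numpy as np

t, gamma = 10.0, 2.0
y = np.linspace(0.0, 20.0, 5)

quadratic_loss = (y - t) ** 2                                     # unbounded
bounded_loss = 1.0 - np.exp(-((y - t) ** 2) / (2 * gamma ** 2))   # stays in [0, 1)

for yi, ql, bl in zip(y, quadratic_loss, bounded_loss):
    print(f"y={yi:5.1f}  quadratic={ql:7.2f}  bounded={bl:.4f}")
```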

Journal: Neural Computation, 2017
Shaobo Lin, Jinshan Zeng, Xiangyu Chang

This letter aims at a refined error analysis for binary classification using a support vector machine (SVM) with a Gaussian kernel and a convex loss. Our first result shows that for some loss functions, such as the truncated quadratic loss and quadratic loss, SVM with a Gaussian kernel can reach the almost optimal learning rate provided the regression function is smooth. Our second result shows that for ...
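A small sketch of the two margin-based losses mentioned: the quadratic loss (1 − yf(x))² and the truncated quadratic (squared hinge) loss max(0, 1 − yf(x))². The pairing with a Gaussian-kernel SVM is as in the abstract; the code itself is only an illustration of the loss functions.

```python
"""Margin-based losses named in the abstract: quadratic loss (1 - y*f)^2 and
truncated quadratic (squared hinge) loss max(0, 1 - y*f)^2, where y in {-1, +1}
is the label and f is the classifier output, so y * f is the margin."""
import numpy as np

def quadratic_loss(margin):
    return (1.0 - margin) ** 2

def truncated_quadratic_loss(margin):
    return np.maximum(0.0, 1.0 - margin) ** 2

margins = np.array([-1.0, 0.0, 0.5, 1.0, 2.0])
print("quadratic:          ", quadratic_loss(margins))
print("truncated quadratic:", truncated_quadratic_loss(margins))
```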

2005
Immanuel M. Bomze, Marco Locatelli, Fabio Tardella

A standard quadratic optimization problem (StQP) consists in minimizing a quadratic form over a simplex. A number of problems can be transformed into a StQP, including the general quadratic problem over a polytope and the maximum clique problem in a graph. In this paper we present several polynomial-time bounds for StQP ranging from very simple and cheap ones to more complex and tight construct...
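For reference, the standard quadratic optimization problem and the classical Motzkin–Straus identity, which is the kind of reduction from maximum clique the abstract alludes to; the clique statement below is the standard theorem, included here only for context.

```latex
% Standard quadratic program (StQP): optimize a quadratic form over the simplex.
\min_{x \in \Delta} \; x^{\top} Q x,
\qquad
\Delta = \{\, x \in \mathbb{R}^{n} : x \ge 0,\; \textstyle\sum_{i} x_i = 1 \,\}.

% Motzkin--Straus: for a graph G with adjacency matrix A_G and clique number \omega(G),
\max_{x \in \Delta} \; x^{\top} A_G x \;=\; 1 - \frac{1}{\omega(G)}.
```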

Journal: Journal of Machine Learning Research, 2011
Tony Jebara

A multitask learning framework is developed for discriminative classification and regression where multiple large-margin linear classifiers are estimated for different prediction problems. These classifiers operate in a common input space but are coupled as they recover an unknown shared representation. A maximum entropy discrimination (MED) framework is used to derive the multitask algorithm w...
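A toy sketch of the "shared representation" coupling described above: several linear predictors are tied together through a common low-dimensional projection, fitted here by alternating least squares on synthetic data. This only illustrates the coupling idea; it is not the maximum entropy discrimination (MED) algorithm from the paper.

```python
"""Toy multitask sketch: T linear predictors share a low-dimensional
representation U (d x k), learned by alternating least squares.  The synthetic
data and the plain least-squares fitting are illustrative assumptions."""
import numpy as np

rng = np.random.default_rng(2)
T, n, d, k = 4, 200, 10, 2

# Synthetic tasks whose true weight vectors lie in a common k-dimensional subspace.
U_true = rng.normal(size=(d, k))
tasks = []
for _ in range(T):
    v = rng.normal(size=k)
    X = rng.normal(size=(n, d))
    y = X @ (U_true @ v) + 0.1 * rng.normal(size=n)
    tasks.append((X, y))

# Alternating minimization: fix U and solve each task's v_t, then refit U jointly.
U = rng.normal(size=(d, k))
for _ in range(20):
    V = [np.linalg.lstsq(X @ U, y, rcond=None)[0] for X, y in tasks]
    # Joint refit of U: X @ U @ v = kron(X, v^T) @ vec(U) with row-major vec(U).
    A = np.vstack([np.kron(X, v[None, :]) for (X, y), v in zip(tasks, V)])
    b = np.concatenate([y for _, y in tasks])
    U = np.linalg.lstsq(A, b, rcond=None)[0].reshape(d, k)

V = [np.linalg.lstsq(X @ U, y, rcond=None)[0] for X, y in tasks]
resid = [np.linalg.norm(X @ U @ v - y) / np.linalg.norm(y)
         for (X, y), v in zip(tasks, V)]
print("relative residuals per task:", np.round(resid, 3))
```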
