Search results for: convex quadratic programming

Number of results: 416944

2015
Alnur Ali, J. Zico Kolter, Steven Diamond, Stephen P. Boyd

We introduce disciplined convex stochastic programming (DCSP), a modeling framework that can significantly lower the barrier for modelers to specify and solve convex stochastic optimization problems, by allowing modelers to naturally express a wide variety of convex stochastic programs in a manner that reflects their underlying mathematical representation. DCSP allows modelers to express expect...
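
For orientation, the sketch below is not the authors' DCSP framework itself; it only shows how a convex stochastic program can be approximated by a sample average and written directly as a convex program in cvxpy. The scenario data A, b, the nonnegativity constraint, and the penalty weight are placeholders.

```python
# Sample-average approximation of min_x E[(a^T x - b)^2] + 0.1*||x||_1, x >= 0,
# expressed as an ordinary convex program in cvxpy (toy data, not DCSP itself).
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n, m = 5, 200                       # decision dimension, number of sampled scenarios
A = rng.standard_normal((m, n))     # each row is one sampled scenario vector a_i
b = rng.standard_normal(m)

x = cp.Variable(n)
expected_loss = cp.sum_squares(A @ x - b) / m   # sample average replaces the expectation
problem = cp.Problem(cp.Minimize(expected_loss + 0.1 * cp.norm1(x)), [x >= 0])
problem.solve()
print("optimal value:", round(problem.value, 4))
```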

Journal: :CoRR 2017
Xingguo Li, Lin F. Yang, Jason Ge, Jarvis D. Haupt, Tong Zhang, Tuo Zhao

We propose a DC proximal Newton algorithm for solving nonconvex regularized sparse learning problems in high dimensions. Our proposed algorithm integrates the proximal Newton algorithm with multi-stage convex relaxation based on difference of convex (DC) programming, and enjoys both strong computational and statistical guarantees. Specifically, by leveraging a sophisticated characterization of ...
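
As a rough illustration of the multi-stage convex relaxation idea (not the proximal Newton solver of the paper), the sketch below repeatedly solves a weighted L1-regularized least-squares problem by plain proximal gradient descent, re-weighting the penalty between stages as in a capped-L1 DC decomposition. The data, penalty level lam, and threshold theta are invented.

```python
# Multi-stage convex relaxation for a capped-L1 penalty: each stage is a weighted
# Lasso solved by proximal gradient (ISTA).  Toy data; not the paper's algorithm.
import numpy as np

def soft_threshold(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def weighted_lasso_ista(X, y, weights, lam, step, iters=500):
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        grad = X.T @ (X @ beta - y) / len(y)        # gradient of (0.5/n)*||X b - y||^2
        beta = soft_threshold(beta - step * grad, step * lam * weights)
    return beta

rng = np.random.default_rng(0)
n, d = 100, 20
X = rng.standard_normal((n, d))
beta_true = np.zeros(d); beta_true[:3] = [2.0, -1.5, 1.0]
y = X @ beta_true + 0.1 * rng.standard_normal(n)

lam, theta = 0.1, 0.5                               # penalty level and capping threshold (assumed)
step = n / np.linalg.norm(X, 2) ** 2                # 1 / Lipschitz constant of the smooth part
weights = np.ones(d)
beta = weighted_lasso_ista(X, y, weights, lam, step)
for _ in range(4):                                  # subsequent convex relaxation stages
    weights = (np.abs(beta) < theta).astype(float)  # stop penalizing coefficients that are already large
    beta = weighted_lasso_ista(X, y, weights, lam, step)
print(np.round(beta, 3))
```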

2001
Nicholas I. M. Gould, Philippe L. Toint

A method for restoring an optical image which is subjected to low-pass frequency filtering is presented. It is assumed that the object whose image is restored is of finite spatial extent. The problem is treated as an algebraic image-restoration problem which is then solved as a quadratic programming problem with bounded variables. The regularization technique for the ill-posed system is to repl...
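
To make the quadratic-programming-with-bounded-variables step concrete, here is a hedged toy sketch using SciPy's lsq_linear on a one-dimensional "blurred" signal; the moving-average blur H and the [0, 1] bounds are assumptions for illustration, not the paper's setup.

```python
# Restore a signal x from low-pass-filtered, noisy observations y = H x + noise,
# by solving the bound-constrained convex QP  min ||H x - y||^2  s.t.  0 <= x <= 1.
import numpy as np
from scipy.linalg import toeplitz
from scipy.optimize import lsq_linear

n = 50
h = np.zeros(n); h[:5] = 1.0 / 5.0          # simple moving-average (low-pass) kernel
H = toeplitz(h, np.zeros(n))                # convolution as a lower-triangular Toeplitz matrix
rng = np.random.default_rng(0)
x_true = (rng.random(n) > 0.7).astype(float)
y = H @ x_true + 0.01 * rng.standard_normal(n)

res = lsq_linear(H, y, bounds=(0.0, 1.0))   # bounded-variable least squares
print("restored signal (first 10 entries):", np.round(res.x[:10], 2))
```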

Journal: :J. Global Optimization 2007
Amir Beck

We consider the outer approximation problem of finding a minimum radius ball enclosing a given intersection of at most n − 1 balls in R^n. We show that if the aforementioned intersection has a nonempty interior, then the problem reduces to minimizing a convex quadratic function over the unit simplex. This result is established by using convexity and representation theorems for a class of quadratic...
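
The reduced problem, minimizing a convex quadratic over the unit simplex, is straightforward to solve numerically; the sketch below uses cvxpy with arbitrary placeholder data Q and c rather than the ball-intersection construction of the paper.

```python
# Minimize 0.5*lam'Q lam + c'lam over the unit simplex {lam >= 0, sum(lam) = 1}.
# Q, c are random placeholders; Q is built to be positive definite so the problem is convex.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
m = 6
B = rng.standard_normal((m, m))
Q = B.T @ B + np.eye(m)                      # symmetric positive definite
c = rng.standard_normal(m)

lam = cp.Variable(m)
objective = cp.Minimize(0.5 * cp.quad_form(lam, Q) + c @ lam)
constraints = [lam >= 0, cp.sum(lam) == 1]   # the unit simplex
cp.Problem(objective, constraints).solve()
print("optimal simplex point:", np.round(lam.value, 3))
```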

Journal: :Reliable Computing 2010
Stefania Corsaro, Marina Marino

In this paper we present a mathematical model for archetypal analysis of data represented by means of intervals of real numbers. We extend the model for single-valued data proposed in the pioneering work of Cutler and Breiman on this topic. The core problem is a non-convex optimization problem, which we solve by means of a sequential quadratic programming method. We show numerical experiments perfo...
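
As a hedged illustration of the sequential quadratic programming ingredient only, the sketch below runs SciPy's SLSQP on a toy archetype-style fitting problem with simplex constraints; the data, the fixed archetypes, and the problem size are invented and do not reflect the interval-valued model of the paper.

```python
# SLSQP (a sequential quadratic programming method) on a toy constrained fitting
# problem: simplex-constrained coefficients reconstructing data from k fixed "archetypes".
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
X = rng.random((10, 2))                  # toy single-valued data (not interval data)
k = 3
archetypes = X[:k]                       # crude fixed archetypes, just to keep the sketch small

def objective(a_flat):
    A = a_flat.reshape(len(X), k)        # coefficient matrix, one simplex row per data point
    return np.sum((X - A @ archetypes) ** 2)

a0 = np.full(len(X) * k, 1.0 / k)
constraints = [{"type": "eq", "fun": lambda a, i=i: a.reshape(len(X), k)[i].sum() - 1.0}
               for i in range(len(X))]   # each coefficient row sums to one
res = minimize(objective, a0, method="SLSQP",
               bounds=[(0.0, 1.0)] * (len(X) * k), constraints=constraints)
print("objective at SLSQP solution:", round(res.fun, 4))
```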

Journal: :SIAM Journal on Optimization 2008
Radek Kucera

A new active set algorithm for minimizing quadratic functions with separable convex constraints is proposed by combining the conjugate gradient method with the projected gradient. It generalizes recently developed algorithms for quadratic programming with simple bound constraints. A linear convergence rate in terms of the spectral condition number of the Hessian is proven. Numerical experiments, includi...
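
The projected-gradient ingredient for a bound-constrained convex quadratic can be sketched in a few lines of numpy; the conjugate gradient steps on the active face, which the paper combines with it, are omitted, and the data below are random placeholders.

```python
# Projected gradient for min 0.5*x'Ax - b'x subject to simple bounds lo <= x <= hi.
import numpy as np

rng = np.random.default_rng(0)
n = 20
M = rng.standard_normal((n, n))
A = M.T @ M + np.eye(n)            # symmetric positive definite Hessian
b = rng.standard_normal(n)
lo, hi = 0.0, 1.0                  # simple bound constraints

x = np.zeros(n)
step = 1.0 / np.linalg.norm(A, 2)  # 1 / largest eigenvalue of A
for _ in range(2000):
    grad = A @ x - b
    x = np.clip(x - step * grad, lo, hi)   # gradient step, then projection onto the box
print("objective:", round(0.5 * x @ A @ x - b @ x, 4))
```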

2000
Nathan W. Brixius, Kurt M. Anstreicher

We describe a branch-and-bound algorithm for the quadratic assignment problem (QAP) that uses a convex quadratic programming (QP) relaxation to obtain a bound at each node. The QP subproblems are approximately solved using the Frank-Wolfe algorithm, which in this case requires the solution of a linear assignment problem on each iteration. Our branching strategy makes extensive use of dual infor...
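
A hedged sketch of the Frank-Wolfe mechanism described here: minimizing a convex quadratic over doubly stochastic matrices, where each linear subproblem is a linear assignment problem solved by scipy.optimize.linear_sum_assignment. The toy objective below stands in for the actual QAP bound, which is problem-specific.

```python
# Frank-Wolfe over the Birkhoff polytope: each iteration solves a linear assignment
# problem to find the best permutation-matrix direction.  Toy quadratic objective.
import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(0)
n = 8
T = rng.random((n, n))             # toy target matrix

def grad(P):
    return P - T                   # gradient of f(P) = 0.5 * ||P - T||_F^2

P = np.full((n, n), 1.0 / n)       # start at the barycenter of the Birkhoff polytope
for k in range(100):
    G = grad(P)
    rows, cols = linear_sum_assignment(G)      # linear subproblem: cheapest permutation matrix
    S = np.zeros_like(P); S[rows, cols] = 1.0
    gamma = 2.0 / (k + 2)                      # standard Frank-Wolfe step size
    P = (1 - gamma) * P + gamma * S
print("objective:", round(0.5 * np.sum((P - T) ** 2), 4))
```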

2010
Gabriele Eichfelder, Janez Povh

The well-known result stating that any non-convex quadratic problem over the nonnegative orthant with some additional linear and binary constraints can be rewritten as a linear problem over the cone of completely positive matrices (Burer, 2009) is generalized by replacing the nonnegative orthant with an arbitrary closed convex cone. This set-semidefinite representation result implies new semidefi...
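
Completely positive (and, more generally, set-semidefinite) programs are not directly tractable, so in practice the cone is relaxed. The sketch below shows the common doubly nonnegative relaxation (PSD plus entrywise nonnegativity) for a standard quadratic program over the simplex, modeled in cvxpy with made-up data; it yields a lower bound, not the exact completely positive program.

```python
# Doubly nonnegative (DNN) relaxation of the completely positive reformulation of
# min x'Qx over the unit simplex: lift x to X ~ xx', require X PSD and X >= 0.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n))
Q = (A + A.T) / 2                        # symmetric but possibly indefinite => non-convex QP

X = cp.Variable((n, n), symmetric=True)
constraints = [X >> 0,                   # positive semidefinite
               X >= 0,                   # entrywise nonnegative
               cp.sum(X) == 1]           # lifted version of (e'x)^2 = 1
prob = cp.Problem(cp.Minimize(cp.trace(Q @ X)), constraints)
prob.solve()
print("DNN lower bound:", round(prob.value, 4))
```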

In this paper, we obtain some new complexity results for solving the semidefinite optimization (SDO) problem by interior-point methods (IPMs). We define a new proximity function for the SDO based on a new kernel function. Furthermore, we formulate an algorithm for a primal-dual interior-point method (IPM) for the SDO using this proximity function and give its complexity analysis, and then we sho...
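
For context, a minimal standard-form semidefinite optimization problem can be stated and solved with cvxpy as below; this only illustrates the problem class on toy data, using whatever conic solver is installed, and is not the kernel-function-based interior-point method analyzed in the abstract.

```python
# A toy primal SDO problem: minimize <C, X> subject to trace(X) = 1 and X PSD.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n = 4
C = rng.standard_normal((n, n))
C = (C + C.T) / 2                        # symmetric cost matrix

X = cp.Variable((n, n), symmetric=True)
prob = cp.Problem(cp.Minimize(cp.trace(C @ X)),
                  [X >> 0, cp.trace(X) == 1])
prob.solve()
print("optimal value:", round(prob.value, 4))
```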

[Chart: number of search results per publication year]