Search results for: modified subgradient method

Number of results: 1831354

Journal: J. Optimization Theory and Applications, 2015
Mikhail A. Bragin, Peter B. Luh, Joseph H. Yan, Nanpeng Yu, Gary A. Stern

Studies have shown that the surrogate subgradient method, used to optimize non-smooth dual functions within the Lagrangian relaxation framework, can lead to significant computational improvements compared to the standard subgradient method. The key idea is to obtain surrogate subgradient directions that form acute angles toward the optimal multipliers without fully minimizing the relaxed problem. The majo...
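
To make the update concrete, here is a minimal, hypothetical Python sketch of surrogate subgradient multiplier updates on a toy relaxed problem; the data, the [0, 1] variable bounds, and the one-block "approximate minimization" are all illustrative assumptions, not the authors' algorithm:

```python
# Toy sketch of a surrogate subgradient step for Lagrangian relaxation;
# problem data and the block-wise "approximate minimization" are
# illustrative assumptions, not the paper's method.
import numpy as np

rng = np.random.default_rng(0)
n, m = 6, 3
c = rng.uniform(1, 2, n)          # separable linear objective, x in [0, 1]^n
A = rng.uniform(0, 1, (m, n))     # coupling constraints A x >= b, relaxed
b = A.sum(axis=1) * 0.5

lam = np.zeros(m)                 # multipliers for A x >= b
x = np.ones(n)                    # current primal estimate

for k in range(50):
    # Approximate minimization of the relaxed problem: re-optimize only
    # one block of variables instead of all of them (the "surrogate" idea).
    j = k % n
    reduced_cost = c[j] - lam @ A[:, j]
    x[j] = 0.0 if reduced_cost > 0 else 1.0
    g = b - A @ x                 # surrogate subgradient: constraint violation
    step = 1.0 / (k + 1)          # diminishing step size (one common choice)
    lam = np.maximum(0.0, lam + step * g)

print("multipliers:", np.round(lam, 3))
```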

2010
G. Svindland

We introduce a generalised subgradient for law-invariant closed convex risk measures on L and establish its relationship with optimal risk allocations and equilibria. Our main result gives sufficient conditions ensuring a non-empty generalised subgradient.
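
For orientation, the classical subdifferential of a convex risk measure ρ at a position X, with respect to the dual pairing ⟨Z, Y⟩ = E[ZY], reads as follows; this standard convex-analysis definition is included only for context, since the abstract does not spell out how the paper's generalised notion departs from it:

```latex
% Classical subdifferential of a convex risk measure \rho at X
% (standard convex-analysis definition, shown for context only):
\[
  \partial\rho(X) = \bigl\{\, Z : \rho(Y) - \rho(X) \ge \mathbb{E}[\,Z\,(Y - X)\,]
  \ \text{for all admissible } Y \,\bigr\}
\]
```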

Journal: Comp. Opt. and Appl., 2014
Dirk A. Lorenz, Marc E. Pfetsch, Andreas M. Tillmann

We propose a new subgradient method for the minimization of nonsmooth convex functions over a convex set. To speed up computations we use adaptive approximate projections, which only require the iterates to move within a certain distance of the exact projections (a distance that decreases in the course of the algorithm). In particular, the iterates in our method can be infeasible throughout the whole procedure. Neverthele...
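
The mechanism can be illustrated with a short sketch. The following assumes a unit-ball feasible set and fakes the inexact projection by perturbing the exact one within the allowed tolerance; this is not the paper's adaptive projection scheme, but it shows how iterates may remain slightly infeasible while the tolerance shrinks:

```python
# Sketch only (not the authors' algorithm): projected subgradient descent
# where the exact projection onto the feasible set C (here, the Euclidean
# unit ball) is replaced by an approximate projection required only to land
# within eps_k of the exact projection, with eps_k shrinking over time.
import numpy as np

def exact_proj(x):                       # projection onto the unit ball
    n = np.linalg.norm(x)
    return x if n <= 1.0 else x / n

def approx_proj(x, eps, rng):            # exact projection + bounded error,
    p = exact_proj(x)                    # standing in for a cheap inexact one
    d = rng.normal(size=x.shape)
    return p + eps * d / np.linalg.norm(d)

rng = np.random.default_rng(1)
a = np.array([2.0, -1.0, 0.5])
f = lambda x: np.abs(x - a).sum()        # nonsmooth convex objective
subgrad = lambda x: np.sign(x - a)       # a subgradient of f at x

x = np.zeros(3)
for k in range(1, 2001):
    step = 1.0 / np.sqrt(k)
    eps_k = 1.0 / k                      # projection accuracy tightens over time
    x = approx_proj(x - step * subgrad(x), eps_k, rng)

x_final = exact_proj(x)                  # snap the last iterate to feasibility
print("f at final iterate:", round(float(f(x_final)), 4))
```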

Journal: SIAM Journal on Optimization, 2014
Yurii Nesterov, S. Shpirko

In this paper we develop a primal-dual subgradient method for solving huge-scale Linear Conic Optimization Problems. Our main assumption is that the primal cone is formed as a direct product of many small-dimensional convex cones, and that the matrix A of the corresponding linear operator is uniformly sparse. In this case, our method can approximate the primal-dual optimal solution with accuracy ε ...
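
A rough dual-side illustration of why uniform sparsity matters: each subgradient step below costs on the order of nnz(A) via sparse matrix-vector products. The toy problem (a dual subgradient update for a box-constrained LP) and all data are assumptions for illustration, not the paper's primal-dual method:

```python
# Illustrative sketch only: when the constraint matrix is uniformly sparse,
# each subgradient step costs O(nnz(A)) rather than O(m*n). The toy problem
# below (dual subgradient updates for min c^T x s.t. A x = b, 0 <= x <= 1)
# is an assumption, not the method of the paper.
import numpy as np
from scipy import sparse

rng = np.random.default_rng(2)
m, n = 1000, 5000
A = sparse.random(m, n, density=0.001, random_state=2, format="csr")
x_feas = (rng.uniform(size=n) < 0.5).astype(float)
b = A @ x_feas                           # ensures the toy problem is feasible
c = rng.uniform(1, 2, n)

lam = np.zeros(m)
for k in range(1, 201):
    # x(lam): coordinatewise minimizer of c^T x + lam^T (b - A x) over [0, 1]^n
    reduced = c - A.T @ lam              # sparse mat-vec: cost ~ nnz(A)
    x = (reduced < 0).astype(float)
    g = b - A @ x                        # subgradient of the dual at lam
    lam += (1.0 / np.sqrt(k)) * g        # equality constraints: lam unsigned

print("dual iterate norm:", round(float(np.linalg.norm(lam)), 3))
```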

2016
Tianbao Yang, Qihang Lin

In this paper, we study the efficiency of a Restarted SubGradient (RSG) method that periodically restarts the standard subgradient method (SG). We show that, when applied to a broad class of convex optimization problems, the RSG method can find an ε-optimal solution with lower complexity than the SG method. In particular, we first show that RSG can reduce the dependence of SG's iteration complexity on ...
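
The restart pattern can be sketched as follows; the stage length, the step-halving schedule, and warm-starting each stage from the best iterate so far are plausible illustrative choices, not the paper's exact parameters:

```python
# Hypothetical sketch of the restart pattern (not the paper's exact RSG):
# run the standard subgradient method for a fixed budget of t iterations,
# then restart it from the best iterate found, halving the step scale.
import numpy as np

a = np.array([3.0, -2.0, 1.0])
f = lambda x: np.abs(x - a).sum()        # nonsmooth convex test function
subgrad = lambda x: np.sign(x - a)       # a subgradient of f at x

def sg(x0, step, t):
    """Standard subgradient method with constant step; returns best iterate."""
    x, best, fbest = x0.copy(), x0.copy(), f(x0)
    for _ in range(t):
        x = x - step * subgrad(x)
        if f(x) < fbest:
            best, fbest = x.copy(), f(x)
    return best

x = np.zeros(3)
step = 1.0
for stage in range(10):                  # each restart halves the step scale
    x = sg(x, step, t=100)
    step *= 0.5

print("f(x) after restarts:", round(float(f(x)), 6))
```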

Journal: Electronic Notes in Discrete Mathematics, 2013
Monia Giandomenico, Adam N. Letchford, Fabrizio Rossi, Stefano Smriglio

The famous Lovász theta number θ(G) is expressed as the optimal solution of a semidefinite program. As such, it can be computed in polynomial time to arbitrary precision. Nevertheless, computing it in practice becomes difficult as the graph grows, despite recent significant advances in semidefinite programming (SDP) solvers. We present a way around SDP whi...
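
For reference, the standard SDP formulation of θ(G) that the paper seeks to avoid at scale can be written down directly; the following cvxpy sketch solves it for the 5-cycle, where θ(C5) = √5:

```python
# Standard SDP formulation of the Lovász theta number (the baseline the
# paper seeks to avoid for large graphs), written with cvxpy for a small
# example graph; requires an SDP-capable solver such as SCS.
import cvxpy as cp

n = 5
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]   # C5, theta = sqrt(5)

X = cp.Variable((n, n), symmetric=True)
constraints = [X >> 0, cp.trace(X) == 1]           # PSD, unit trace
constraints += [X[i, j] == 0 for (i, j) in edges]  # zero on edges
prob = cp.Problem(cp.Maximize(cp.sum(X)), constraints)
prob.solve(solver=cp.SCS)

print("theta(C5) ~", round(prob.value, 4))          # expect ~2.2361
```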

[Chart: number of search results per year]