Search results for: modified subgradient method

Number of results: 1,831,354

Journal: Oper. Res. Lett., 2000
Hanif D. Sherali, Gyunghyun Choi, Cihan H. Tuncbilek

This paper presents a new variable target value method (VTVM) that can be used in conjunction with pure or deflected subgradient strategies. The proposed procedure assumes no a priori knowledge regarding bounds on the optimal value. The target values are updated iteratively whenever necessary, depending on the information obtained in the process of the algorithm. Moreover, convergence of the seq...
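As an illustration of the kind of iteration such target-value schemes build on, here is a minimal Polyak-style subgradient sketch in Python with an ad hoc target update; the function names and the target-update rule are assumptions for illustration, not the VTVM rule of the paper:

import numpy as np

def target_subgradient_method(f, subgrad, x0, target, n_iters=1000, shrink=0.5):
    """Polyak-style subgradient steps driven by an adjustable target value.

    f       : convex objective, callable on numpy arrays
    subgrad : returns a subgradient of f at x
    target  : current estimate of the optimal value (no a priori bound assumed)
    """
    x = x0.astype(float).copy()
    best_x, best_val = x.copy(), f(x)
    for _ in range(n_iters):
        fx = f(x)
        if fx <= target:
            # Target reached: lower the target and keep iterating
            # (a crude stand-in for the paper's variable target update).
            target = fx - shrink * (best_val - fx + 1e-3)
            continue
        g = subgrad(x)
        step = (fx - target) / max(np.dot(g, g), 1e-12)  # Polyak step toward the target
        x = x - step * g
        if f(x) < best_val:
            best_x, best_val = x.copy(), f(x)
    return best_x, best_val

# Example: minimize the nonsmooth f(x) = ||x||_1 starting from (2, -3).
best_x, best_val = target_subgradient_method(
    lambda x: np.abs(x).sum(),
    lambda x: np.sign(x),
    np.array([2.0, -3.0]),
    target=0.0,
)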

Journal: J. Optimization Theory and Applications, 2015
Heinz H. Bauschke, Caifang Wang, Xianfu Wang, Jia Xu

The subgradient projection iteration is a classical method for solving a convex inequality. Motivated by works of Polyak and of Crombez, we present and analyze a more general method for finding a fixed point of a cutter, provided that the fixed point set has nonempty interior. Our assumptions on the parameters are more general than existing ones. Various limiting examples and comparisons are pr...
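For context, a minimal sketch in Python of the classical subgradient projection iteration for a convex inequality f(x) <= 0, which is the method this paper generalizes; the relaxation parameter lam and the example are illustrative assumptions:

import numpy as np

def subgradient_projection(f, subgrad, x0, lam=1.0, tol=1e-10, max_iter=10_000):
    """Relaxed subgradient projections for finding x with f(x) <= 0, f convex.

    When f(x) > 0, the iterate is moved toward the half-space
    {y : f(x) + <g, y - x> <= 0}, which contains the set {f <= 0}.
    lam in (0, 2) is the relaxation parameter (lam = 1 gives the exact
    projection onto that half-space).
    """
    x = x0.astype(float).copy()
    for _ in range(max_iter):
        fx = f(x)
        if fx <= tol:
            return x                                   # feasible up to tolerance
        g = subgrad(x)                                 # any subgradient of f at x
        x = x - lam * fx / float(np.dot(g, g)) * g
    return x

# Example: the unit-ball inequality ||x||^2 - 1 <= 0, started outside the ball.
x_feas = subgradient_projection(lambda x: x @ x - 1.0,
                                lambda x: 2.0 * x,
                                np.array([3.0, 4.0]))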

2016
Antonin Chambolle

2 (First order) Descent methods, rates
2.1 Gradient descent
2.2 What can we achieve?
2.3 Second order methods: Newton’s method
2.4 Multistep first order methods
2.4.1 Heavy ball method ...
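To accompany the descent-method outline above, a minimal gradient descent loop in Python; the step size, stopping rule, and example quadratic are illustrative assumptions:

import numpy as np

def gradient_descent(grad, x0, step=0.05, tol=1e-8, max_iter=10_000):
    """Plain gradient descent: x_{k+1} = x_k - step * grad(x_k).

    For an L-smooth convex objective, a constant step size <= 1/L yields
    the classical O(1/k) decrease of the objective gap.
    """
    x = x0.astype(float).copy()
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        x = x - step * g
    return x

# Example: minimize 0.5 * ||A x - b||^2, whose gradient is A^T (A x - b).
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 0.0])
x_min = gradient_descent(lambda x: A.T @ (A @ x - b), np.zeros(2))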

Journal: SIAM Journal on Optimization, 2015
Heinz H. Bauschke, Caifang Wang, Xianfu Wang, Jia Xu

The subgradient projector is of considerable importance in convex optimization because it plays the key role in Polyak’s seminal work — and the many papers it spawned — on subgradient projection algorithms for solving convex feasibility problems. In this paper, we offer a systematic study of the subgradient projector. Fundamental properties such as continuity, nonexpansiveness, and monotonicity...

2010
G. Svindland

We introduce a generalised subgradient for law-invariant closed convex risk measures on L and establish its relationship with optimal risk allocations and equilibria. Our main result gives sufficient conditions ensuring a non-empty generalised subgradient.

Journal: CoRR, 2017
Patrick R. Johnstone, Pierre Moulin

The purpose of this manuscript is to derive new convergence results for several subgradient methods for minimizing nonsmooth convex functions with Hölderian growth. The growth condition is satisfied in many applications and includes functions with quadratic growth and functions with weakly sharp minima as special cases. To this end there are four main contributions. First, for a constant and su...

1999
Luiz Antonio N. Lorena, Marcelo Gonçalves Narciso

The Traveling Salesman Problem (TSP) is an intensively studied classical combinatorial optimization problem. The Lagrangean relaxation was first applied to the TSP in 1970. The Lagrangean relaxation limit approximates what is known today as the HK (Held and Karp) bound, a very good bound (less than 1% from optimal) for a large class of symmetric instances. It became a reference bound for new heurist...
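To make the role of the subgradient step concrete, here is a generic Python sketch of subgradient ascent on a Lagrangian dual bound; solve_relaxation, the step-size rule, and the shrinking schedule are illustrative placeholders rather than the exact Held-Karp procedure:

import numpy as np

def lagrangian_dual_ascent(solve_relaxation, u0, upper_bound, n_iters=200, theta=2.0):
    """Subgradient ascent for maximizing a Lagrangian dual bound L(u).

    solve_relaxation(u) must return (L(u), s), where s is a subgradient of L
    at u, typically the constraint violations of the relaxed solution
    (for the TSP 1-tree relaxation: node degrees minus 2).
    upper_bound is any valid upper bound, e.g. the length of a heuristic tour.
    """
    u = u0.astype(float).copy()
    best_bound = -np.inf
    for _ in range(n_iters):
        value, s = solve_relaxation(u)
        best_bound = max(best_bound, value)
        norm2 = float(np.dot(s, s))
        if norm2 == 0.0:
            break                         # relaxed solution feasible: bound is tight
        step = theta * (upper_bound - value) / norm2
        u = u + step * s                  # move the multipliers along the subgradient
        theta *= 0.95                     # shrink the step parameter (simple schedule)
    return best_bound, u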

1997
Joakim Petersson, Michael Patriksson

We consider the solution of finite element discretized optimum sheet problems by an iterative algorithm. The problem is that of maximizing the stiffness of a sheet subject to constraints on the admissible designs and unilateral contact conditions on the displacements. The model allows for zero design volumes, and thus constitutes a true topology optimization problem. We propose and evaluate a subg...
