Search results for: modified subgradient method

Number of results: 1831354

Journal: :CoRR 2017
Benjamin Grimmer

We extend the classic convergence rate theory for subgradient methods to apply to non-Lipschitz functions. For the deterministic projected subgradient method, we present a global O(1/√T) convergence rate for any convex function which is locally Lipschitz around its minimizers. This approach is based on Shor's classic subgradient analysis and implies generalizations of the standard convergenc...
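The projected subgradient method the abstract refers to is standard; below is a minimal sketch (not the paper's exact algorithm or analysis), using the classical fixed step size proportional to 1/√T that yields the O(1/√T) rate on the averaged iterate. The helper names `subgrad` and `project` are assumptions for illustration.

```python
import numpy as np

def projected_subgradient(subgrad, project, x0, T, R=1.0):
    """Projected subgradient method: x_{k+1} = P(x_k - alpha * g_k),
    with the fixed step alpha = R / sqrt(T). Returns the average of the
    iterates, which enjoys the classical O(1/sqrt(T)) rate for convex f."""
    x = np.asarray(x0, dtype=float)
    avg = np.zeros_like(x)
    step = R / np.sqrt(T)
    for _ in range(T):
        g = subgrad(x)            # any subgradient of f at x
        x = project(x - step * g) # Euclidean projection onto the feasible set
        avg += x
    return avg / T

# Example: minimize f(x) = |x - 3| over the box [0, 2]; the minimizer is x = 2.
subgrad = lambda x: np.sign(x - 3.0)
project = lambda x: np.clip(x, 0.0, 2.0)
x_hat = projected_subgradient(subgrad, project, np.array([0.0]), T=2000, R=2.0)
```

The averaged iterate `x_hat` approaches the constrained minimizer 2 as T grows.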

Journal: :RAIRO - Operations Research 2007
Paulo J. S. Silva Carlos Humes

We present an inexact interior point proximal method to solve linearly constrained convex problems. In fact, we derive a primal-dual algorithm to solve the KKT conditions of the optimization problem using a modified version of the rescaled proximal method. We also present a pure primal method. The proposed proximal method has as a distinctive feature the possibility of allowing inexact inner step...

1999
X. Zhao P. B. Luh J. Wang D. D. Yao Yu-Chi Ho

The subgradient method is used frequently to optimize dual functions in Lagrangian relaxation for separable integer programming problems. In the method, all subproblems must be solved optimally to obtain a subgradient direction. In this paper, the surrogate subgradient method is developed, where a proper direction can be obtained without optimally solving all the subproblems. In fact, only an a...

Journal: :Symmetry 2021

We propose a modified extragradient method for solving the variational inequality problem in Hilbert space. The method is a combination of the well-known subgradient extragradient method with Mann's mean value iteration, in which the updated iterate is picked from the convex hull of all previous iterates. We show weak convergence to a solution of the problem, provided that a condition on the corresponding averaging matrix is fulfilled. Some numerical experiments are given to illustrate the effectiv...

Journal: :CoRR 2012
Simon Lacoste-Julien Mark W. Schmidt Francis R. Bach

In this note, we present a new averaging technique for the projected stochastic subgradient method. By using a weighted average with a weight of t + 1 for each iterate wt at iteration t, we obtain the convergence rate of O(1/t) with both an easy proof and an easy implementation. The new scheme is compared empirically to existing techniques, with similar performance behavior.
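The (t+1)-weighted average described above can be maintained online with a simple recursion: since the weights 1, 2, ..., t+1 sum to (t+1)(t+2)/2, the new iterate w_{t+1} enters the running average with coefficient 2/(t+3). Below is a minimal sketch of projected subgradient descent with this averaging; the function names and the test problem are illustrative assumptions, not the paper's code.

```python
import numpy as np

def weighted_avg_sgd(subgrad, project, x0, T, step):
    """Projected (stochastic) subgradient method with a (t+1)-weighted
    average of the iterates, maintained online via
        w_bar <- (1 - rho) * w_bar + rho * w_new,   rho = 2 / (t + 3),
    which gives iterate w_k weight proportional to k + 1."""
    w = np.asarray(x0, dtype=float)
    w_bar = w.copy()                  # weighted average, initialized at w_0
    for t in range(T):
        g = subgrad(w)                # (stochastic) subgradient at w
        w = project(w - step(t) * g)  # this is iterate w_{t+1}
        rho = 2.0 / (t + 3)           # relative weight of w_{t+1}
        w_bar = (1 - rho) * w_bar + rho * w
    return w_bar

# Illustrative use: minimize f(w) = w^2 / 2 (gradient w), no constraint,
# with the strongly convex step size 1 / (t + 1).
w_bar = weighted_avg_sgd(lambda w: w, lambda w: w,
                         np.array([5.0]), T=100, step=lambda t: 1.0 / (t + 1))
```

Because early iterates receive small weight, the average forgets the starting point quickly, which is the mechanism behind the O(1/t) rate in the strongly convex setting.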

2004
Giovanni Colombo Peter R. Wolenski

We derive a formula for the minimal time function where the dynamics are linear and the target is convex. Based on this formula, we give a new proof of the semiconvexity of the minimal time function, a result originally due to Cannarsa and Sinestrari.

[Chart: number of search results per year]