Search results for: modified subgradient method

Number of results: 1831354

Journal: Symmetry 2021

Our main focus in this work is the classical variational inequality problem with a Lipschitz continuous and pseudo-monotone mapping in real Hilbert spaces. An adaptive reflected subgradient-extragradient method is presented along with its weak convergence analysis. The novelty of the proposed method lies in the fact that only one projection onto the feasible set is required in each iteration, and there is no need to know/approximate the Lipschitz constant...
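
Schematically, one iteration of a subgradient-extragradient update of this kind looks as follows; this is a minimal sketch with a fixed step `tau` and a user-supplied projection `proj_C` (both assumptions), not the paper's adaptive reflected variant. Only one projection onto the feasible set C is needed, because the second projection is onto an explicit half-space and has a closed form.

```python
import numpy as np

def subgradient_extragradient_step(x, F, proj_C, tau):
    """One subgradient-extragradient step for the variational inequality VI(F, C).

    Sketch with a fixed step tau; adaptive/reflected variants modify z and tau.
    """
    z = x - tau * F(x)
    y = proj_C(z)                # the single projection onto the feasible set C
    a = z - y                    # normal of the half-space T = {w : <a, w - y> <= 0}; C lies in T
    u = x - tau * F(y)
    if np.dot(a, a) > 0:
        # closed-form projection of u onto the half-space T
        u = u - max(0.0, np.dot(a, u - y)) / np.dot(a, a) * a
    return u
```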

Journal: SIAM Journal on Optimization 2009
Björn Johansson, Maben Rabi, Mikael Johansson

We present an algorithm that generalizes the randomized incremental subgradient method with fixed stepsize due to Nedić and Bertsekas [SIAM J. Optim., 12 (2001), pp. 109–138]. Our novel algorithm is particularly suitable for distributed implementation and execution, and possible applications include distributed optimization, e.g., parameter estimation in networks of tiny wireless sensors. The s...
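
The underlying iteration is simple to state: at each step, sample one component function uniformly at random and take a projected subgradient step with the fixed stepsize. A minimal sketch for min_x sum_i f_i(x) over a closed convex set X, assuming subgradient oracles and a projection operator (names hypothetical):

```python
import numpy as np

def randomized_incremental_subgradient(x0, subgrads, proj_X, alpha, iters, seed=0):
    """Randomized incremental subgradient method with fixed stepsize alpha.

    subgrads[i](x) returns a subgradient of the i-th component f_i at x;
    proj_X is the Euclidean projection onto the feasible set X.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        i = rng.integers(len(subgrads))          # sample one component uniformly
        x = proj_X(x - alpha * subgrads[i](x))   # step using that component only
    return x
```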

Journal: J. Global Optimization 2006
Regina Sandra Burachik, Rafail N. Gasimov, Nergiz A. Ismayilova, C. Yalçin Kaya

We study convergence properties of a modified subgradient algorithm, applied to the dual problem defined by the sharp augmented Lagrangian. The primal problem we consider is nonconvex and nondifferentiable, with equality constraints. We obtain primal and dual convergence results, as well as a condition for existence of a dual solution. Using a practical selection of the step-size parameters, we...
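
For the primal problem min f(x) subject to h(x) = 0, x in X, the sharp augmented Lagrangian penalizes the norm of the constraint rather than its square, and the modified subgradient algorithm ascends the resulting dual function. Schematically (notation assumed; step rules vary across this literature):

```latex
L(x, u, c) = f(x) + c\,\lVert h(x)\rVert - \langle u, h(x)\rangle ,
\qquad
H(u, c) = \min_{x \in X} L(x, u, c),
```
with dual updates of the form $u_{k+1} = u_k - s_k\, h(x_k)$ and $c_{k+1} = c_k + (s_k + \epsilon_k)\lVert h(x_k)\rVert$, where $x_k$ minimizes $L(\cdot, u_k, c_k)$.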

Journal: Numerische Mathematik 2010
Fethallah Benmansour, Guillaume Carlier, Gabriel Peyré, Filippo Santambrogio

This paper describes the Subgradient Marching algorithm to compute the derivative of the geodesic distance with respect to the metric. The geodesic distance being a concave function of the metric, this algorithm computes an element of the subgradient in O(N log(N)) operations on a discrete grid of N points. It performs a front propagation that computes the subgradient of a discrete geodesic dist...
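
As a simplified illustration of why a subgradient is cheap to extract here, consider the graph analogue: the distance to a point is a minimum, over paths, of linear functions of the edge weights, hence concave in the weights, and any minimizing path yields an element of the (super)differential. The sketch below uses Dijkstra on a weighted graph, not the authors' fast-marching discretization:

```python
import heapq

def distance_and_subgradient(n_nodes, edges, weights, src, dst):
    """Shortest-path distance d(src, dst) and a subgradient of d with respect
    to the edge weights (graph analogue of Subgradient Marching; simplified).

    edges: list of undirected pairs (u, v); weights[i] >= 0 is the length of edges[i].
    Returns (distance, g), where g[i] = 1.0 iff edge i lies on the shortest path found.
    """
    adj = [[] for _ in range(n_nodes)]
    for i, (u, v) in enumerate(edges):
        adj[u].append((v, i))
        adj[v].append((u, i))
    dist = [float("inf")] * n_nodes
    pred = [None] * n_nodes                  # index of the edge used to reach each node
    dist[src] = 0.0
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist[u]:
            continue
        for v, i in adj[u]:
            nd = d + weights[i]
            if nd < dist[v]:
                dist[v], pred[v] = nd, i
                heapq.heappush(heap, (nd, v))
    g = [0.0] * len(edges)                   # d is a min of linear maps, hence concave;
    v = dst                                  # the active path gives a supergradient
    while pred[v] is not None:
        i = pred[v]
        g[i] = 1.0
        u, w = edges[i]
        v = u if v == w else w
    return dist[dst], g
```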

Journal: Signal Processing 2006
Alper T. Erdogan

We introduce a novel subgradient optimization-based framework for iterative peak-to-average power ratio (PAR) reduction for multicarrier systems, such as wireless orthogonal frequency division multiplexing (OFDM) and wireline discrete multitone (DMT) very high-speed digital subscriber line (DMT-VDSL) systems. The proposed approach uses reserved or unused tones to minimize the peak magnitude of ...
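
A minimal sketch of the tone-reservation idea: the peak magnitude is a maximum of moduli of linear functions of the reserved-tone values, so a subgradient is available from the peak sample alone. The stepsize `mu`, the plain subgradient update, and all names below are assumptions rather than the paper's exact scheme:

```python
import numpy as np

def reduce_peak(X, reserved, mu=1.0, iters=50):
    """Iterative peak-magnitude reduction by subgradient steps on reserved tones.

    X:        length-N complex frequency-domain symbol (data on used tones).
    reserved: indices of reserved/unused tones whose values are free variables.
    """
    N = len(X)
    c = np.zeros(N, dtype=complex)         # correction lives only on reserved tones
    for _ in range(iters):
        x = np.fft.ifft(X + c)             # time-domain signal
        n = int(np.argmax(np.abs(x)))      # peak location: the active term of the max
        phase = x[n] / abs(x[n])           # direction of the peak sample
        # subgradient of |x[n]| with respect to the tone values (Wirtinger calculus)
        g = phase * np.exp(-2j * np.pi * np.arange(N) * n / N)
        c[reserved] -= (mu / N) * g[reserved]
    return X + c
```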

Journal: SIAM Journal on Control and Optimization 2021

We propose a single time-scale stochastic subgradient method for constrained optimization of a composition of several nonsmooth and nonconvex functions. The functions are assumed to be locally Lipschitz and differentiable in a generalized sense. Only estimates of the values and derivatives are used. The method is parameter-free. We prove convergence with probability one of the method by associating with it a system of differential inclusions dev...
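
The problem class and the flavor of the analysis can be stated schematically (notation assumed):

```latex
\min_{x \in X} \; f_1\!\bigl(f_2(\cdots f_k(x)\cdots)\bigr),
\qquad f_i \ \text{locally Lipschitz and differentiable in a generalized sense},
```
with convergence argued in the style of the ODE method: the interpolated iterates are shown to track trajectories of an associated system of differential inclusions.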

Journal: Numerical Algorithms 2021

We use techniques originating from the subdiscipline of mathematical logic called 'proof mining' to provide rates of metastability and, under a metric regularity assumption, rates of convergence for a subgradient-type algorithm solving the equilibrium problem in convex optimization over fixed-point sets of firmly nonexpansive mappings. The algorithm is due to H. Iiduka and I. Yamada, who in 2009 gave a noneffective proof...
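
A rate of metastability here means an explicit bound $\Phi$ in the finitary (Tao-style) reformulation of the Cauchy property of the iterates:

```latex
\forall \varepsilon > 0 \;\; \forall g : \mathbb{N} \to \mathbb{N} \;\;
\exists n \le \Phi(\varepsilon, g) \;\;
\forall i, j \in [\,n,\, n + g(n)\,] : \; \lVert x_i - x_j \rVert \le \varepsilon .
```
This statement is classically equivalent to convergence of $(x_n)$, but unlike a full rate of convergence, $\Phi$ is extractable even from noneffective proofs.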

Journal: CoRR 2018
Damek Davis, Dmitriy Drusvyatskiy

We prove that the projected stochastic subgradient method, applied to a weakly convex problem, drives the gradient of the Moreau envelope to zero at the rate O(k^{-1/4}).
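
Here the Moreau envelope supplies the stationarity measure: for a $\rho$-weakly convex $f$ and $0 < \lambda < 1/\rho$,

```latex
f_\lambda(x) = \min_{y} \Bigl\{ f(y) + \tfrac{1}{2\lambda}\,\lVert y - x \rVert^2 \Bigr\},
\qquad
\nabla f_\lambda(x) = \lambda^{-1}\bigl(x - \operatorname{prox}_{\lambda f}(x)\bigr),
```
and a small $\lVert \nabla f_\lambda(x) \rVert$ certifies that $x$ is close to a point that is nearly stationary for $f$.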

2010
Michel Baes, Michael Buergisser

We show that the Hedge Algorithm, a method widely used in Machine Learning, can be interpreted as a particular subgradient algorithm for minimizing a well-chosen convex function, namely a Mirror Descent Scheme. Using this reformulation, we can slightly improve the worst-case convergence guarantees of the Hedge Algorithm. Recently, Nesterov has introduced the class of Primal-Dual Subgradient Algo...
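
The equivalence is concrete: Hedge's multiplicative-weights update is exactly a mirror descent (entropic) subgradient step on the probability simplex. A minimal sketch, with array shapes and the fixed learning rate `eta` as assumptions:

```python
import numpy as np

def hedge(losses, eta):
    """Hedge over n experts: multiplicative-weights updates, i.e. mirror descent
    with the entropy mirror map on the probability simplex.

    losses: (T, n) array of per-round expert losses; returns the weight history.
    """
    T, n = losses.shape
    w = np.full(n, 1.0 / n)                 # uniform prior over experts
    history = []
    for t in range(T):
        history.append(w.copy())
        w = w * np.exp(-eta * losses[t])    # exponentiated-gradient step
        w /= w.sum()                        # renormalize onto the simplex
    return np.array(history)
```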
