Nonlinear Rescaling as Interior Quadratic Prox Method in Convex Optimization


Similar articles

Nonlinear Rescaling as Interior Quadratic Prox Method in Convex Optimization

A class Ψ of strictly concave and twice continuously differentiable functions ψ : R → R with particular properties is used for constraint transformation in the framework of a Nonlinear Rescaling (NR) method with “dynamic” scaling parameter updates. We show that the NR method is equivalent to the Interior Quadratic Prox method for the dual problem in a rescaled dual space. The equivalence is use...
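For orientation, here is a minimal LaTeX sketch of an NR step of the kind the abstract describes, for a convex program min{ f(x) : c_i(x) >= 0, i = 1, ..., m }. The normalization of ψ and the schematic form of the rescaled dual prox term are illustrative assumptions; the paper's "dynamic" rule for updating the scaling parameter is not reproduced here.

% One NR iteration with fixed scaling parameter k > 0 and \psi \in \Psi,
% normalized (as an assumption) so that \psi(0)=0, \psi'(0)=1, \psi'' < 0,
% which makes c_i(x) \ge 0 equivalent to k^{-1}\psi(k\,c_i(x)) \ge 0.
% needs \usepackage{amsmath}
\begin{align*}
  \mathcal{L}_k(x,\lambda) &= f(x) - k^{-1}\sum_{i=1}^{m} \lambda_i\,\psi\big(k\,c_i(x)\big),\\
  x^{s+1} &\in \arg\min_{x\in\mathbb{R}^n} \mathcal{L}_k\big(x,\lambda^{s}\big),\\
  \lambda_i^{s+1} &= \lambda_i^{s}\,\psi'\big(k\,c_i(x^{s+1})\big), \qquad i=1,\dots,m.
\end{align*}
% Dual reading (schematic only): the multiplier update behaves like a prox step on
% the dual function d with a quadratic distance taken in a rescaled dual space,
% \lambda^{s+1} \approx \arg\max_{\lambda}\big\{ d(\lambda)
%   - \tfrac{1}{2k}(\lambda-\lambda^{s})^{\top} R_s\,(\lambda-\lambda^{s}) \big\},
% where R_s is a positive diagonal rescaling induced by \psi and the current iterates.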


Proximal Point Nonlinear Rescaling Method for Convex Optimization

Nonlinear rescaling (NR) methods alternate finding an unconstrained minimizer of the Lagrangian for the equivalent problem in the primal space (which is an infinite procedure) with an update of the Lagrange multipliers. We introduce and study a proximal point nonlinear rescaling (PPNR) method that preserves convergence and retains the linear convergence rate of the original NR method and at the same time d...
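The snippet breaks off before the PPNR construction is stated. Purely as a hedged illustration of how a proximal point term can be combined with the NR step sketched above, one may regularize the primal subproblem quadratically; the parameter \gamma and the centering at x^{s} are assumptions for illustration, not details taken from the paper.

% Hypothetical primal-regularized NR step (\mathcal{L}_k as in the sketch above;
% not necessarily the PPNR update studied in the paper). Taking \gamma > 0 makes
% each subproblem strongly convex, so its minimizer exists and is unique.
% needs \usepackage{amsmath}
\begin{align*}
  x^{s+1} &\in \arg\min_{x\in\mathbb{R}^n}\Big\{ \mathcal{L}_k\big(x,\lambda^{s}\big)
            + \tfrac{\gamma}{2}\,\|x - x^{s}\|^{2} \Big\},\\
  \lambda_i^{s+1} &= \lambda_i^{s}\,\psi'\big(k\,c_i(x^{s+1})\big), \qquad i=1,\dots,m.
\end{align*}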


Primal-Dual Nonlinear Rescaling Method for Convex Optimization

In this paper, we consider a general primal-dual nonlinear rescaling (PDNR) method for convex optimization with inequality constraints. We prove the global convergence of the PDNR method and estimate error bounds for the primal and dual sequences. In particular, we prove that, under the standard second-order optimality conditions, the error bounds for the primal and dual sequences converge to zer...



Nonlinear rescaling vs. smoothing technique in convex optimization

We introduce an alternative to the smoothing technique approach for constrained optimization. As it turns out, for any given smoothing function there exists a modification with particular properties. We use this modification for Nonlinear Rescaling (NR) of the constraints of a given constrained optimization problem into an equivalent set of constraints. The constraint transformation is scaled by a ...
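As a concrete illustration of smoothing-type functions that satisfy the usual NR normalization, two classical transformations are shown below; whether these are the particular modifications studied in this paper is not stated in the snippet.

% Two classical transformations with \psi(0)=0,\ \psi'(0)=1,\ \psi''<0:
\[
  \psi_{\exp}(t) = 1 - e^{-t} \quad (t \in \mathbb{R}),
  \qquad
  \psi_{\log}(t) = \ln(1+t) \quad (t > -1).
\]
% Because \psi is strictly increasing with \psi(0)=0, for any k > 0
\[
  c_i(x) \ge 0 \;\Longleftrightarrow\; k^{-1}\psi\big(k\,c_i(x)\big) \ge 0,
\]
% so the rescaled constraints define the same feasible set as the original ones.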



Journal

Journal title: Computational Optimization and Applications

Year: 2006

ISSN: 0926-6003, 1573-2894

DOI: 10.1007/s10589-006-9759-0