Convergence Rate of an Optimization Algorithm for Minimizing Quadratic Functions with Separable Convex Constraints
Author
Abstract
A new active set algorithm for minimizing quadratic functions with separable convex constraints is proposed by combining the conjugate gradient method with the projected gradient. It generalizes recently developed algorithms for quadratic programming with simple bound constraints. A linear convergence rate in terms of the spectral condition number of the Hessian is proven. Numerical experiments, including frictional three-dimensional (3D) contact problems of linear elasticity, illustrate the computational performance.
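The combination described in the abstract can be illustrated, in the special case of simple bounds that the paper says it generalizes, by a rough sketch: projected-gradient steps update the active set, and conjugate-gradient iterations minimize over the currently free variables. The function name, step-size rule, and stopping tolerances below are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

def box_qp_sketch(A, b, lo, hi, x0, tol=1e-8, max_outer=500, cg_iters=20):
    """Minimize 0.5*x'Ax - b'x subject to lo <= x <= hi (elementwise),
    with A symmetric positive definite and lo, hi given as vectors.
    Illustrative only: projected-gradient steps identify the active set,
    CG iterations minimize over the free variables."""
    x = np.clip(np.asarray(x0, dtype=float), lo, hi)
    L = np.linalg.norm(A, 2)                    # Lipschitz constant of the gradient
    for _ in range(max_outer):
        g = A @ x - b
        # first-order optimality residual for the box constraints
        if np.linalg.norm(x - np.clip(x - g, lo, hi)) < tol:
            break
        # 1) projected-gradient step: cheap, updates the active set
        x = np.clip(x - g / L, lo, hi)
        # 2) CG on the free variables, active components held fixed
        free = (x > lo) & (x < hi)
        if not free.any():
            continue
        Af = A[np.ix_(free, free)]
        bf = b[free] - A[np.ix_(free, ~free)] @ x[~free]
        xf = x[free].copy()
        r = bf - Af @ xf
        d = r.copy()
        for _ in range(cg_iters):
            if np.linalg.norm(r) < tol:
                break
            Ad = Af @ d
            alpha = (r @ r) / (d @ Ad)
            step = xf + alpha * d
            if np.any(step < lo[free]) or np.any(step > hi[free]):
                break                           # step would leave the box: stop inner CG
            r_new = r - alpha * Ad
            d = r_new + ((r_new @ r_new) / (r @ r)) * d
            xf, r = step, r_new
        x[free] = xf
    return x
```

For separable convex constraints beyond simple bounds, the clipping would be replaced by the projection onto each separable constraint set; the structure of the outer loop stays the same.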
Similar resources
Applying a Newton Method to Strictly Convex Separable Network Quadratic Programs
Introduction: This paper describes the application of a Newton method to solving strictly convex separable network quadratic programs. The authors provide a brief synopsis of separable network quadratic programming and list the various techniques for solving such problems. The main thrust of the paper is succinctly identified by the following: 1. Providing a generic subroutine that can be used by vari...
An Interior Point Algorithm for Solving Convex Quadratic Semidefinite Optimization Problems Using a New Kernel Function
In this paper, we consider convex quadratic semidefinite optimization problems and provide a primal-dual Interior Point Method (IPM) based on a new kernel function with a trigonometric barrier term. The iteration complexity of the algorithm is analyzed under some mild, easy-to-check conditions. Although our proposed kernel function is neither a Self-Regular (SR) fun...
Regularized Interior Proximal Alternating Direction Method for Separable Convex Optimization Problems
In this article we present a version of the proximal alternating direction method for a convex problem with linear constraints and a separable objective function, in which the standard quadratic regularizing term is replaced with an interior proximal metric for those variables that are required to satisfy some additional convex constraints. Moreover, the proposed method has the advantage that t...
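A plain, quadratically regularized alternating direction step for a separable objective with a linear coupling constraint looks roughly like the sketch below; in the cited approach the quadratic proximal term is replaced by an interior proximal metric for the variables that carry additional convex constraints. The consensus-form l1-regularized least-squares instance and all names are illustrative assumptions, not that paper's method.

```python
import numpy as np

def admm_lasso_sketch(C, d, lam, rho=1.0, iters=200):
    """Standard ADMM for  min 0.5*||Cx - d||^2 + lam*||z||_1  s.t.  x - z = 0.
    Shows only the alternating-direction structure with a quadratic
    rho/2-regularization; the cited paper replaces that term."""
    n = C.shape[1]
    x = np.zeros(n); z = np.zeros(n); u = np.zeros(n)   # u: scaled dual variable
    K = C.T @ C + rho * np.eye(n)                       # x-update system matrix
    for _ in range(iters):
        # x-update: (C'C + rho I) x = C'd + rho (z - u)
        x = np.linalg.solve(K, C.T @ d + rho * (z - u))
        # z-update: proximal operator of lam*||.||_1 (soft thresholding)
        z = np.sign(x + u) * np.maximum(np.abs(x + u) - lam / rho, 0.0)
        # dual update on the coupling constraint x = z
        u = u + x - z
    return z
```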
Global convergence of an inexact interior-point method for convex quadratic symmetric cone programming
In this paper, we propose a feasible interior-point method for convex quadratic programming over symmetric cones. The proposed algorithm relaxes the accuracy requirements in the solution of the Newton equation system by using an inexact Newton direction. Furthermore, we obtain an acceptable level of error in the inexact algorithm on convex quadratic symmetric cone programmin...
A remark on accelerated block coordinate descent for computing the proximity operators of a sum of convex functions
We analyze alternating descent algorithms for minimizing the sum of a quadratic function and block separable non-smooth functions. In case the quadratic interactions between the blocks are pairwise, we show that the schemes can be accelerated, leading to improved convergence rates with respect to related accelerated parallel proximal descent. As an application we obtain very fast algorithms for...
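The alternating-descent idea in this entry can be sketched as cyclic proximal-gradient steps over coordinate blocks for a quadratic-plus-block-separable objective. The l1 term, the block partition, and the step sizes below are assumptions chosen for illustration, and the acceleration discussed in the cited paper is not included.

```python
import numpy as np

def block_prox_grad_sketch(Q, c, lam, x0, n_blocks=2, iters=300):
    """Minimize 0.5*x'Qx + c'x + lam*||x||_1 (Q symmetric positive semidefinite)
    by cycling proximal-gradient steps over blocks of coordinates."""
    x = np.asarray(x0, dtype=float).copy()
    blocks = np.array_split(np.arange(x.size), n_blocks)
    for _ in range(iters):
        for blk in blocks:
            g = Q[blk] @ x + c[blk]                               # block gradient of the smooth part
            L = np.linalg.norm(Q[np.ix_(blk, blk)], 2) + 1e-12    # block step-size bound
            v = x[blk] - g / L
            x[blk] = np.sign(v) * np.maximum(np.abs(v) - lam / L, 0.0)  # block soft-threshold
    return x
```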
Journal: SIAM Journal on Optimization
Volume 19, Issue –
Pages –
Publication date: 2008