A proximal-based decomposition method for convex minimization problems
Authors
Abstract
This paper presents a decomposition method for solving convex minimization problems. At each iteration, the algorithm computes two proximal steps in the dual variables and one proximal step in the primal variables. We derive this algorithm from Rockafellar's proximal method of multipliers, which involves an augmented Lagrangian with an additional quadratic proximal term. The algorithm preserves the good features of the proximal method of multipliers, with the additional advantage that it decouples the constraints and is thus suitable for parallel implementation. We allow the proximal minimization steps to be computed approximately, and we prove that, under mild assumptions on the problem's data, the method is globally convergent at a linear rate. The method is compared with alternating direction type methods and applied to the particular case of minimizing a convex function over a finite intersection of closed convex sets. AMS Subject Classification: 90C25.
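The proximal minimization steps the abstract refers to are instances of the proximal operator, prox_{λf}(v) = argmin_x { f(x) + (1/2λ)‖x − v‖² }. As a hedged illustration (this is not the paper's algorithm; the function names and the ℓ1/box choices are ours), here is a minimal sketch of two such operators, including the projection case relevant to minimizing over an intersection of closed convex sets:

```python
import numpy as np

def prox_l1(v, lam):
    """Proximal operator of f(x) = lam * ||x||_1, i.e.
    argmin_x lam*||x||_1 + 0.5*||x - v||^2.
    Has the closed form known as soft-thresholding."""
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def prox_box(v, lo, hi):
    """Proximal operator of the indicator function of the box
    [lo, hi]^n: it reduces to the Euclidean projection onto the set,
    which is how constraint sets enter proximal decomposition methods."""
    return np.clip(v, lo, hi)

# One illustrative proximal step: pull the point v toward the
# minimizer of f while staying close to v in Euclidean distance.
v = np.array([3.0, -0.5, 0.2])
step = prox_l1(v, 1.0)   # components with |v_i| <= 1 are set to 0
proj = prox_box(v, -1.0, 1.0)
```

When the objective or constraint set splits across blocks, each block's proximal step can be evaluated independently, which is the decoupling that makes such schemes amenable to parallel implementation.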
Similar references
On the Proximal Jacobian Decomposition of ALM for Multiple-Block Separable Convex Minimization Problems and Its Relationship to ADMM
The augmented Lagrangian method (ALM) is a benchmark for solving convex minimization problems with linear constraints. When the objective function of the model under consideration is representable as the sum of some functions without coupled variables, a Jacobian or Gauss-Seidel decomposition is often implemented to decompose the ALM subproblems so that the functions’ properties could be used m...
A Decomposition Method Using Bregman Distances to Convex Separable Minimization Problems
In this paper we propose an extension of the proximal decomposition algorithm using Bregman distances to solve convex separable minimization problems. Under some standard assumptions it is proved that the iterations generated by the algorithm are well defined and some convergence results are obtained.
Decomposition Techniques for Bilinear Saddle Point Problems and Variational Inequalities with Affine Monotone Operators
The majority of First Order methods for large-scale convex-concave saddle point problems and variational inequalities with monotone operators are proximal algorithms which at every iteration need to minimize over problem’s domain X the sum of a linear form and a strongly convex function. To make such an algorithm practical, X should be proximal-friendly – admit a strongly convex function with e...
Iteration-complexity of block-decomposition algorithms and the alternating minimization augmented Lagrangian method
In this paper, we consider the monotone inclusion problem consisting of the sum of a continuous monotone map and a point-to-set maximal monotone operator with a separable two-block structure and introduce a framework of block-decomposition prox-type algorithms for solving it which allows for each one of the single-block proximal subproblems to be solved in an approximate sense. Moreover, by sho...
A proximal Newton framework for composite minimization: Graph learning without Cholesky decompositions and matrix inversions
We propose an algorithmic framework for convex minimization problems of composite functions with two terms: a self-concordant part and a possibly nonsmooth regularization part. Our method is a new proximal Newton algorithm with local quadratic convergence rate. As a specific problem instance, we consider sparse precision matrix estimation problems in graph learning. Via a careful dual formulati...
Journal title: Math. Program.
Volume: 64
Pages: -
Publication year: 1994