Large-scale randomized-coordinate descent methods with non-separable linear constraints

Authors

  • Sashank J. Reddi
  • Ahmed Hefny
  • Carlton Downey
  • Avinava Dubey
  • Suvrit Sra
Abstract

We develop randomized block coordinate descent (CD) methods for linearly constrained convex optimization. Unlike other large-scale CD methods, we do not assume the constraints to be separable, but allow them to be coupled linearly. To our knowledge, ours is the first CD method that allows linear coupling constraints without making the global iteration complexity depend exponentially on the number of constraints. We present algorithms and theoretical analysis for four key (convex) scenarios: (i) smooth; (ii) smooth + separable nonsmooth; (iii) asynchronous parallel; and (iv) stochastic. We discuss some architectural details of our methods and present preliminary results to illustrate the behavior of our algorithms.
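To make the constraint-preserving coordinate update concrete, here is a minimal sketch for the simplest instance, a single sum constraint 1ᵀx = b: moving along the two-coordinate direction eᵢ − eⱼ leaves the sum unchanged. This is a toy version of the idea only, not the paper's algorithm; the names pairwise_cd, grad, and L are ours.

```python
import numpy as np

# Minimal sketch (not the paper's algorithm): randomized pairwise CD for
#   min f(x)  subject to  1^T x = b,  the simplest coupling constraint.
# Moving along e_i - e_j keeps 1^T x unchanged; L is a Lipschitz
# constant of the gradient of f.

def pairwise_cd(grad, x, L, n_iters=2000, seed=0):
    rng = np.random.default_rng(seed)
    n = x.size
    for _ in range(n_iters):
        i, j = rng.choice(n, size=2, replace=False)
        g = grad(x)  # a real CD method would compute only entries i and j
        step = -(g[i] - g[j]) / (2.0 * L)   # valid step: smoothness along
        x[i] += step                        # e_i - e_j is at most 2L
        x[j] -= step                        # sum(x) is preserved exactly
    return x

# Toy check: least squares whose unconstrained optimum already sums to 5,
# started from a different feasible point.
rng = np.random.default_rng(1)
A = rng.standard_normal((20, 5))
y = A @ np.ones(5)
grad = lambda x: A.T @ (A @ x - y)
L = np.linalg.norm(A, 2) ** 2                # spectral-norm smoothness bound
x = pairwise_cd(grad, np.array([5.0, 0, 0, 0, 0]), L)
print(abs(x.sum() - 5.0))                    # constraint residual: ~0
print(np.round(x, 2))                        # approaches the all-ones solution
```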


Similar articles

Parallel Direction Method of Multipliers

We consider the problem of minimizing block-separable convex functions subject to linear constraints. While the Alternating Direction Method of Multipliers (ADMM) for two-block linear constraints has been intensively studied both theoretically and empirically, effective generalizations of ADMM to multiple blocks remain unclear in spite of some preliminary work. In this paper, we propose a pa...
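For background, the classical two-block ADMM iteration in scaled-dual form (as in Boyd et al.) is the baseline this paragraph contrasts against. The sketch below specializes it to the lasso; it is not the multi-block method proposed above, and admm_lasso and soft_threshold are our own names.

```python
import numpy as np

# Background sketch: two-block ADMM (scaled-dual form) on the lasso
#   min 0.5*||A x - y||^2 + lam*||z||_1   s.t.   x - z = 0.

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def admm_lasso(A, y, lam, rho=1.0, n_iters=200):
    n = A.shape[1]
    x, z, u = np.zeros(n), np.zeros(n), np.zeros(n)
    M = np.linalg.inv(A.T @ A + rho * np.eye(n))   # cached x-update matrix
    for _ in range(n_iters):
        x = M @ (A.T @ y + rho * (z - u))          # smooth block
        z = soft_threshold(x + u, lam / rho)       # nonsmooth block (prox of l1)
        u = u + x - z                              # scaled dual update
    return z

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 10))
y = A @ np.r_[3.0, -2.0, np.zeros(8)] + 0.01 * rng.standard_normal(50)
print(np.round(admm_lasso(A, y, lam=1.0), 2))      # recovers the sparse signal
```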


A Random Coordinate Descent Method on Large-scale Optimization Problems with Linear Constraints

In this paper we develop a random block coordinate descent method for minimizing large-scale convex problems with linearly coupled constraints and prove that it obtains in expectation an ε-accurate solution in at most O(1/ε) iterations. However, the numerical complexity per iteration of the new method is usually much cheaper than that of methods based on full gradient information. We focus on...
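To unpack the O(1/ε) claim: a sublinear rate of this type means the expected suboptimality decays like C/k over iterations k, so roughly C/ε iterations suffice. The exact constant C, which depends on the problem data, is specified in the paper itself.

```latex
% "eps-accurate in O(1/eps) iterations" unpacked: C is a problem-dependent
% constant (its exact form is given in the paper).
\[
  \mathbb{E}\bigl[f(x_k)\bigr] - f^\star \;\le\; \frac{C}{k}
  \qquad\Longrightarrow\qquad
  k \;\ge\; \frac{C}{\varepsilon}
  \ \text{ iterations ensure }\
  \mathbb{E}\bigl[f(x_k)\bigr] - f^\star \;\le\; \varepsilon.
\]
```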


Stochastic Parallel Block Coordinate Descent for Large-Scale Saddle Point Problems

We consider convex-concave saddle point problems with a separable structure and non-strongly convex functions. We propose an efficient stochastic block coordinate descent method using adaptive primal-dual updates, which enables flexible parallel optimization for large-scale problems. Our method shares the efficiency and flexibility of block coordinate descent methods with the simplicity of prim...
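As a toy illustration of the primal-dual flavor of such saddle-point methods, the sketch below runs simultaneous (hence trivially parallelizable) multiplicative-weights updates with iterate averaging on a bilinear matrix game. It is a stand-in for intuition only, not the adaptive block coordinate method described above; matrix_game is our name.

```python
import numpy as np

# Toy primal-dual sketch on a bilinear matrix game
#   min_{x in simplex} max_{y in simplex} x^T A y.
# Averaged iterates of simultaneous multiplicative-weights updates
# approach an equilibrium of the game.

def matrix_game(A, n_iters=2000, eta=0.1):
    m, n = A.shape
    x, y = np.full(m, 1.0 / m), np.full(n, 1.0 / n)
    x_avg, y_avg = np.zeros(m), np.zeros(n)
    for _ in range(n_iters):
        gx, gy = A @ y, A.T @ x                    # partial gradients of x^T A y
        x = x * np.exp(-eta * gx); x /= x.sum()    # min player: descent
        y = y * np.exp(+eta * gy); y /= y.sum()    # max player: ascent
        x_avg += x; y_avg += y
    return x_avg / n_iters, y_avg / n_iters

A = np.array([[0.0, 1.0, -1.0],
              [-1.0, 0.0, 1.0],
              [1.0, -1.0, 0.0]])                   # rock-paper-scissors payoffs
x, y = matrix_game(A)
print(np.round(x, 2), np.round(y, 2))              # both near uniform (1/3 each)
```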


Linear Convergence of Gradient and Proximal-Gradient Methods Under the Polyak-Łojasiewicz Condition

  • Simple proof of linear convergence.
  • For convex functions, equivalent to several of the above conditions.
  • For non-convex functions, weakest assumption while still guaranteeing global minimizer.
  • We generalize the PL condition to analyze proximal-gradient methods.
  • We give simple new analyses in a variety of settings: least-squares and logistic regression; randomized coordinate descen...
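For reference, the Polyak-Łojasiewicz (PL) condition and the linear rate it yields can be stated as follows: combining the descent lemma f(x_{k+1}) ≤ f(x_k) − (1/2L)‖∇f(x_k)‖² for gradient descent with step size 1/L on an L-smooth f with the PL inequality gives the contraction below.

```latex
% The PL inequality (mu > 0) and the linear rate it implies for
% gradient descent with step size 1/L on an L-smooth function f:
\[
  \tfrac{1}{2}\,\|\nabla f(x)\|^2 \;\ge\; \mu\,\bigl(f(x) - f^\star\bigr)
  \quad\Longrightarrow\quad
  f(x_{k+1}) - f^\star \;\le\; \Bigl(1 - \tfrac{\mu}{L}\Bigr)\bigl(f(x_k) - f^\star\bigr).
\]
```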


An Efficient Inexact ABCD Method for Least Squares Semidefinite Programming

We consider least squares semidefinite programming (LSSDP) where the primal matrix variable must satisfy given linear equality and inequality constraints, and must also lie in the intersection of the cone of symmetric positive semidefinite matrices and a simple polyhedral set. We propose an inexact accelerated block coordinate descent (ABCD) method for solving LSSDP via its dual, which can be r...
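A core subroutine in least squares SDP solvers of this kind is projection onto the PSD cone, which has a standard closed form via eigenvalue clipping. The sketch below shows that building block only, not the inexact ABCD method itself; project_psd is our name.

```python
import numpy as np

# Standard building block: the nearest (Frobenius-norm) PSD matrix to a
# symmetric X is obtained by zeroing out its negative eigenvalues.

def project_psd(X):
    S = (X + X.T) / 2.0                       # symmetrize defensively
    w, V = np.linalg.eigh(S)                  # S = V diag(w) V^T
    return (V * np.maximum(w, 0.0)) @ V.T     # clip negative eigenvalues

X = np.array([[2.0, -3.0], [-3.0, 2.0]])      # indefinite: eigenvalues 5 and -1
P = project_psd(X)
print(np.linalg.eigvalsh(P))                  # [0., 5.] -- now PSD
```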



Journal title:

Volume   Issue

Pages  -

Publication date: 2015