A Random Coordinate Descent Algorithm for Singly Linear Constrained Smooth Optimization

Author

  • I. NECOARA
Abstract

In this paper we develop a novel randomized block-coordinate descent method for minimizing multi-agent convex optimization problems with singly linear coupled constraints over networks, and we prove that it obtains in expectation an ε-accurate solution in at most O(1/(λ2(Q) ε)) iterations, where λ2(Q) is the second smallest eigenvalue of a matrix Q defined in terms of the probabilities and the number of blocks. However, the computational cost per iteration of our method is much lower than that of a method based on full gradient information, and each iteration can be computed in a completely distributed way. We focus on how to choose the probabilities so that this randomized algorithm converges as fast as possible, which leads to solving a sparse SDP. Numerical tests confirm that on very large optimization problems our method is much more efficient than methods based on full gradient information.
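The abstract does not give the update formulas, but the mechanism can be illustrated with a short sketch. The code below is a minimal, assumed reconstruction, not the paper's exact algorithm: at each iteration it draws a random pair of coordinates (i, j) (uniformly here, whereas the paper tunes these probabilities via the sparse SDP mentioned above) and moves along the only direction supported on (i, j) that keeps the single linear constraint a^T x = b satisfied, with a step size based on per-coordinate Lipschitz constants. The names `random_pair_cd`, `grad`, and `lips` are illustrative, not from the paper.

```python
import numpy as np

def random_pair_cd(grad, lips, a, x0, iters=20000, seed=0):
    """Sketch of a randomized 2-coordinate descent method for
        min f(x)  subject to  a^T x = b,
    assuming x0 is feasible and lips[i] bounds the Lipschitz constant of
    the i-th partial derivative of f.  Each iteration updates only two
    coordinates (i, j) along the direction (a[j], -a[i]), which leaves
    a^T x unchanged, so every iterate stays feasible."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float).copy()
    n = x.size
    for _ in range(iters):
        # Uniform pair sampling; the paper instead chooses these
        # probabilities (via a sparse SDP) to maximize lambda_2(Q).
        i, j = rng.choice(n, size=2, replace=False)
        # Only the two partial derivatives g[i], g[j] are actually needed;
        # the full gradient is computed here just to keep the sketch short.
        g = grad(x)
        slope = g[i] * a[j] - g[j] * a[i]            # derivative along the feasible direction
        curv = lips[i] * a[j] ** 2 + lips[j] * a[i] ** 2
        if curv > 0.0:
            s = -slope / curv                        # minimizer of the local quadratic model
            x[i] += s * a[j]
            x[j] -= s * a[i]
    return x

# Toy usage: minimize ||x - c||^2 subject to sum(x) = 1.
n = 50
c = np.random.default_rng(1).standard_normal(n)
a = np.ones(n)
x0 = np.full(n, 1.0 / n)                             # feasible starting point
x = random_pair_cd(lambda x: 2.0 * (x - c), np.full(n, 2.0), a, x0)
print(abs(a @ x - 1.0))                              # constraint preserved up to round-off
```

Because only two partial derivatives and two coordinates are touched per iteration, each step is far cheaper than a full-gradient step and can be carried out by the two corresponding agents alone, which is the distributed-computation property the abstract refers to.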


Similar Articles

Random Block Coordinate Descent Methods for Linearly Constrained Optimization over Networks

In this paper we develop random block coordinate descent methods for minimizing large-scale linearly constrained convex problems over networks. Since coupled constraints appear in the problem, we devise an algorithm that updates in parallel at each iteration at least two random components of the solution, chosen according to a given probability distribution. Those computations can be performed ...


A random coordinate descent algorithm for optimization problems with composite objective function and linear coupled constraints

In this paper we propose a variant of the random coordinate descent method for solving linearly constrained convex optimization problems with composite objective functions. If the smooth part of the objective function has Lipschitz continuous gradient, then we prove that our method obtains an ε-optimal solution in O(N/ε) iterations, where N is the number of blocks. For the class of problems wit...


Efficient random coordinate descent algorithms for large-scale structured nonconvex optimization

In this paper we analyze several new methods for solving nonconvex optimization problems with the objective function formed as a sum of two terms: one is nonconvex and smooth, and another is convex but simple and its structure is known. Further, we consider both cases: unconstrained and linearly constrained nonconvex problems. For optimization problems of the above structure, we propose random ...


An Asynchronous Parallel Stochastic Coordinate Descent Algorithm

We describe an asynchronous parallel stochastic coordinate descent algorithm for minimizing smooth unconstrained or separably constrained functions. The method achieves a linear convergence rate on functions that satisfy an essential strong convexity property and a sublinear rate (1/K) on general convex functions. Near-linear speedup on a multicore system can be expected if the number of proces...


Large-scale randomized-coordinate descent methods with non-separable linear constraints

We develop randomized block coordinate descent (CD) methods for linearly constrained convex optimization. Unlike other large-scale CD methods, we do not assume the constraints to be separable, but allow them to be coupled linearly. To our knowledge, ours is the first CD method that allows linear coupling constraints, without making the global iteration complexity have an exponential dependence on ...




Publication date: 2012