Inexact Coordinate Descent: Complexity and Preconditioning
Authors
Abstract
In this paper we consider the problem of minimizing a convex function using a randomized block coordinate descent method. One of the key steps at each iteration of the algorithm is determining the update to a block of variables. Existing algorithms assume that, in order to compute the update, a particular subproblem is solved exactly. In this work we relax this requirement and allow the subproblem to be solved inexactly, leading to an inexact block coordinate descent method. Our approach incorporates the best known results for exact updates as a special case. These theoretical guarantees are complemented by practical considerations: the use of iterative techniques to determine the update, and the use of preconditioning for further acceleration.
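To make the idea concrete, here is a minimal sketch (not code from the paper) of an inexact randomized block coordinate descent step for the quadratic f(x) = ½xᵀAx − bᵀx, where the block subproblem is solved only approximately by a few Jacobi-preconditioned conjugate gradient iterations; all function and parameter names below are illustrative.

```python
import numpy as np

def inexact_block_cd(A, b, n_blocks=4, inner_iters=3, outer_iters=200, seed=0):
    """Randomized block coordinate descent for f(x) = 0.5 x^T A x - b^T x.

    The block subproblem (f restricted to one block) is solved *inexactly*
    with a few Jacobi-preconditioned conjugate gradient steps.
    """
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    x = np.zeros(n)
    blocks = np.array_split(np.arange(n), n_blocks)
    for _ in range(outer_iters):
        blk = blocks[rng.integers(n_blocks)]
        # Block gradient of f at x (the block residual of A x = b).
        g = A[blk] @ x - b[blk]
        # Inexact update: approximately solve A[blk, blk] d = -g by PCG.
        d = pcg(A[np.ix_(blk, blk)], -g, max_iters=inner_iters)
        x[blk] += d
    return x

def pcg(H, rhs, max_iters, tol=1e-12):
    """A few Jacobi-preconditioned conjugate gradient steps for H d = rhs."""
    Minv = 1.0 / np.diag(H)       # diagonal (Jacobi) preconditioner
    d = np.zeros_like(rhs)
    r = rhs.copy()                # residual rhs - H d  (d starts at 0)
    z = Minv * r
    p = z.copy()
    rz = r @ z
    for _ in range(max_iters):
        if rz < tol:              # already (numerically) solved
            break
        Hp = H @ p
        alpha = rz / (p @ Hp)
        d += alpha * p
        r -= alpha * Hp
        z = Minv * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return d
```

Since the minimizer of this f solves Ax = b, the trade-off the paper analyzes, inner accuracy (here `inner_iters`) versus the number of outer iterations, is easy to observe on such a test problem.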
Similar resources
A Comparison of Inexact Newton and Coordinate Descent Mesh Optimization Techniques
We compare inexact Newton and coordinate descent methods for optimizing the quality of a mesh by repositioning the vertices, where quality is measured by the harmonic mean of the mean-ratio metric. The effects of problem size, element size heterogeneity, and various vertex displacement schemes on the performance of these algorithms are assessed for a series of tetrahedral meshes.
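As a point of reference, here is a sketch of the objective such mesh optimizers work with, assuming the mean-ratio quality of a tetrahedron is measured against an equilateral reference element W; the formula is the standard mean-ratio shape metric, but the code is illustrative and not taken from the paper.

```python
import numpy as np

# Edge matrix of the reference (unit equilateral) tetrahedron, columns = edges.
W = np.array([[1.0, 0.5, 0.5],
              [0.0, np.sqrt(3.0) / 2.0, np.sqrt(3.0) / 6.0],
              [0.0, 0.0, np.sqrt(2.0 / 3.0)]])
W_INV = np.linalg.inv(W)

def mean_ratio(verts):
    """Mean-ratio shape quality of a tetrahedron (4x3 vertex array).

    Equals 1 for an ideal (equilateral) element and tends to 0 as the
    element degenerates; inverted elements are assigned quality 0.
    """
    A = (verts[1:] - verts[0]).T          # 3x3 matrix of edge vectors
    S = A @ W_INV                         # map from reference to physical element
    det = np.linalg.det(S)
    if det <= 0.0:                        # inverted or degenerate element
        return 0.0
    return 3.0 * det ** (2.0 / 3.0) / np.sum(S * S)

def harmonic_mean_quality(vertices, tets):
    """Harmonic mean of mean-ratio over all elements (quantity maximized).

    Any element with quality 0 drives the harmonic mean to 0, which is why
    such methods must keep iterates away from inverted configurations.
    """
    q = np.array([mean_ratio(vertices[t]) for t in tets])
    return len(q) / np.sum(1.0 / q)
```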
Complexity of Inexact Proximal Newton Methods
Recently, several so-called proximal Newton methods were proposed for sparse optimization [6, 11, 8, 3]. These methods construct a composite quadratic approximation using Hessian information, optimize this approximation using a first-order method such as coordinate descent, and employ a line search to ensure sufficient descent. Here we propose a general framework, which includes slightly modif...
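A minimal sketch of the inner step this describes, assuming an ℓ1-regularized objective f(x) + λ‖x‖₁: cyclic coordinate descent applied to the composite quadratic model Q(d) = ∇f(x)ᵀd + ½dᵀHd + λ‖x + d‖₁, where stopping after a fixed number of sweeps is precisely what makes the direction inexact. All names are illustrative, and H is assumed to have a positive diagonal.

```python
import numpy as np

def inner_cd_step(x, grad, H, lam, sweeps=5):
    """Approximately minimize the composite quadratic model

        Q(d) = grad^T d + 0.5 d^T H d + lam * ||x + d||_1

    by cyclic coordinate descent, returning an inexact proximal Newton
    direction d. Each 1-D subproblem is solved exactly by soft-thresholding.
    """
    n = len(x)
    d = np.zeros(n)
    Hd = np.zeros(n)                       # maintain H @ d incrementally
    for _ in range(sweeps):
        for i in range(n):
            # Derivative of the smooth part w.r.t. d_i, excluding H_ii * d_i.
            c = grad[i] + Hd[i] - H[i, i] * d[i]
            # Exact 1-D minimizer in the shifted variable u = x_i + d_i.
            u = soft_threshold(x[i] - c / H[i, i], lam / H[i, i])
            d_new = u - x[i]
            Hd += H[:, i] * (d_new - d[i])  # keep H @ d consistent
            d[i] = d_new
    return d

def soft_threshold(v, t):
    return np.sign(v) * max(abs(v) - t, 0.0)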
Practical inexact proximal quasi-Newton method with global complexity analysis
Recently, several methods were proposed for sparse optimization which make careful use of second-order information [11, 34, 20, 4] to improve local convergence rates. These methods construct a composite quadratic approximation using Hessian information, optimize this approximation using a first-order method such as coordinate descent, and employ a line search to ensure sufficient descent. Here w...
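The line search these methods share is typically a backtracking search on the composite objective with a sufficient-decrease condition of Tseng-Yun type; a minimal sketch follows, with illustrative names (f below is the smooth part only, and the constants sigma and beta are conventional choices, not values from the paper).

```python
import numpy as np

def line_search(f, grad, x, d, lam, sigma=0.25, beta=0.5, max_tries=30):
    """Backtracking line search on F(x) = f(x) + lam * ||x||_1 along d.

    Accepts the largest alpha in {1, beta, beta^2, ...} satisfying the
    sufficient-decrease condition  F(x + alpha*d) <= F(x) + sigma*alpha*Delta,
    where Delta = grad^T d + lam*(||x + d||_1 - ||x||_1) < 0 predicts descent.
    """
    F0 = f(x) + lam * np.abs(x).sum()
    delta = grad @ d + lam * (np.abs(x + d).sum() - np.abs(x).sum())
    alpha = 1.0
    for _ in range(max_tries):
        x_new = x + alpha * d
        if f(x_new) + lam * np.abs(x_new).sum() <= F0 + sigma * alpha * delta:
            return alpha, x_new
        alpha *= beta
    return alpha, x + alpha * d   # fall back to the smallest step tried
```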
Randomized Similar Triangles Method: A Unifying Framework for Accelerated Randomized Optimization Methods (Coordinate Descent, Directional Search, Derivative-Free Method)
In this paper, we consider smooth convex optimization problems with simple constraints and inexactness in the oracle information, such as the value, partial, or directional derivatives of the objective function. We introduce a unifying framework which allows one to construct different types of accelerated randomized methods for such problems and to prove convergence rate theorems for them. We focus on a...
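For illustration only, here is one concrete member of this family, written in the style of accelerated randomized coordinate descent schemes such as APPROX; it is not the paper's similar-triangles method itself, and grad_i (the i-th partial derivative oracle) and L (coordinate-wise Lipschitz constants) are assumed inputs.

```python
import numpy as np

def accelerated_rcd(grad_i, L, x0, iters=1000, seed=0):
    """Accelerated randomized coordinate descent (single-coordinate sampling).

    grad_i(y, i) returns the i-th partial derivative of a smooth convex f;
    L[i] is the Lipschitz constant of that partial derivative.
    """
    rng = np.random.default_rng(seed)
    n = len(x0)
    x = x0.copy()
    z = x0.copy()
    theta = 1.0 / n
    for _ in range(iters):
        y = (1.0 - theta) * x + theta * z       # extrapolation point
        i = rng.integers(n)
        dz = -grad_i(y, i) / (n * theta * L[i]) # step on the z-sequence
        z[i] += dz
        x = y.copy()
        x[i] = y[i] + n * theta * dz            # x_{k+1} = y_k + n*theta*(z_{k+1} - z_k)
        # Update the momentum parameter for the next iteration.
        theta = 0.5 * (np.sqrt(theta**4 + 4.0 * theta**2) - theta**2)
    return x
```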
Low Complexity and High Speed in Leading DCD ERLS Algorithm
Adaptive algorithms adjust the system coefficients based on the measured data. This paper presents a dichotomous coordinate descent method to reduce the computational complexity and to improve the tracking ability, based on a variable forgetting factor, when the system undergoes many changes. Vedic mathematics is used to implement the multiplier and the divider in the VFF equatio...
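The core idea of dichotomous coordinate descent (DCD) is to solve the RLS normal equations R h = β with coordinate updates whose step sizes are powers of two, so that in fixed-point hardware every multiplication by the step becomes a bit shift. Below is a floating-point sketch of the "leading" variant with illustrative parameter names; the variable-forgetting-factor and Vedic-arithmetic parts of the paper are omitted.

```python
import numpy as np

def leading_dcd(R, beta, H=1.0, n_bits=15, max_updates=8):
    """Leading dichotomous coordinate descent for R h = beta.

    Step sizes are powers of two (H/2, H/4, ...), so in fixed-point hardware
    scaling by alpha is a bit shift rather than a multiplication. max_updates
    caps the number of successful coordinate updates, bounding complexity.
    """
    n = len(beta)
    h = np.zeros(n)
    r = beta.copy()                        # residual beta - R h
    alpha = H / 2.0                        # current power-of-two step size
    bit = 1
    updates = 0
    while updates < max_updates:
        k = np.argmax(np.abs(r))           # "leading" coordinate: largest residual
        # Halve the step until it is small enough for this coordinate.
        while abs(r[k]) <= (alpha / 2.0) * R[k, k]:
            alpha /= 2.0
            bit += 1
            if bit > n_bits:
                return h                   # step-size precision exhausted
        s = np.sign(r[k])
        h[k] += s * alpha                  # power-of-two update (a shift in hardware)
        r -= s * alpha * R[:, k]           # keep the residual consistent
        updates += 1
    return h
```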
Journal: J. Optimization Theory and Applications
Volume: 170, Issue: -
Pages: -
Publication date: 2016