A proximal method for composite minimization
Authors
A. S. Lewis, S. J. Wright
Abstract
We consider minimization of functions that are compositions of prox-regular functions with smooth vector functions. A wide variety of important optimization problems can be formulated in this way. We describe a subproblem constructed from a linearized approximation to the objective and a regularization term, investigating the properties of local solutions of this subproblem and showing that they eventually identify a manifold containing the solution of the original problem. We propose an algorithmic framework based on this subproblem and prove a global convergence result.
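To make the construction concrete, the following minimal sketch instantiates the subproblem for an objective h(c(x)) with the illustrative choice h = ||.||_1 and a toy smooth map c: at the current iterate it solves min_d h(c(x) + ∇c(x)d) + (mu/2)||d||^2 and steps to x + d. The toy problem, the fixed mu, and all names are assumptions for illustration only; the paper treats general prox-regular h and manages the regularization adaptively.

```python
# Minimal sketch of the linearized-plus-regularization subproblem from the
# abstract, specialized to h = l1 norm (an illustrative prox-regular choice).
# The toy map c, the fixed mu, and all names are assumptions, not the paper's.
import numpy as np
import cvxpy as cp

def c(x):
    """A smooth vector function c: R^2 -> R^3 (toy residuals)."""
    return np.array([x[0] ** 2 - 1.0, x[1] ** 2 - 2.0, x[0] * x[1] - 0.5])

def jac_c(x):
    """Jacobian of c at x."""
    return np.array([[2.0 * x[0], 0.0],
                     [0.0, 2.0 * x[1]],
                     [x[1], x[0]]])

def prox_linear_step(x, mu):
    """Solve min_d  h(c(x) + J(x) d) + (mu/2) ||d||^2  with h = ||.||_1."""
    c0, J = c(x), jac_c(x)
    d = cp.Variable(x.size)
    obj = cp.norm1(c0 + J @ d) + (mu / 2.0) * cp.sum_squares(d)
    cp.Problem(cp.Minimize(obj)).solve()
    return x + d.value

x = np.array([2.0, 2.0])
for _ in range(25):
    # Fixed mu for brevity; the paper's framework adapts the regularization
    # and tests step acceptance before moving to the new iterate.
    x = prox_linear_step(x, mu=1.0)
print("x =", x, " objective h(c(x)) =", np.abs(c(x)).sum())
```

With h the squared norm this step reduces to a Levenberg-Marquardt step, which is one way to see the subproblem as a natural generalization of classical methods for nonlinear least squares.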
Similar Articles
An Accelerated Proximal Coordinate Gradient Method
We develop an accelerated randomized proximal coordinate gradient (APCG) method for solving a broad class of composite convex optimization problems. In particular, our method achieves faster linear convergence rates for minimizing strongly convex functions than existing randomized proximal coordinate gradient methods. We show how to apply the APCG method to solve the dual of the regularized em...
Geometric Descent Method for Convex Composite Minimization
In this paper, we extend the geometric descent method recently proposed by Bubeck, Lee and Singh [5] to solve nonsmooth and strongly convex composite problems. We prove that the resulting algorithm, GeoPG, converges at the linear rate (1 − 1/√κ), where κ is the condition number of the problem, and thus achieves the optimal rate among first-order methods. Numerical results on linear regression and ...
Efficient k-Support-Norm Regularized Minimization via Fully Corrective Frank-Wolfe Method
The k-support-norm regularized minimization has recently been applied with success to sparse prediction problems. The proximal gradient method is conventionally used to minimize this composite model. However, it tends to suffer from an expensive per-iteration cost, so solving the model can be time consuming. In our work, we reformulate the k-support-norm regularized formulation into a constrained fo...
A proximal Newton framework for composite minimization: Graph learning without Cholesky decompositions and matrix inversions
We propose an algorithmic framework for convex minimization problems of composite functions with two terms: a self-concordant part and a possibly nonsmooth regularization part. Our method is a new proximal Newton algorithm with local quadratic convergence rate. As a specific problem instance, we consider sparse precision matrix estimation problems in graph learning. Via a careful dual formulati...
Randomized block proximal damped Newton method for composite self-concordant minimization
In this paper we consider the composite self-concordant (CSC) minimization problem, which minimizes the sum of a self-concordant function f and a (possibly nonsmooth) proper closed convex function g. The CSC minimization is the cornerstone of the path-following interior point methods for solving a broad class of convex optimization problems. It has also found numerous applications in machine le...
An Accelerated Randomized Proximal Coordinate Gradient Method and its Application to Regularized Empirical Risk Minimization
We consider the problem of minimizing the sum of two convex functions: one is smooth and given by a gradient oracle, and the other is separable over blocks of coordinates and has a simple known structure over each block. We develop an accelerated randomized proximal coordinate gradient (APCG) method for minimizing such convex composite functions. For strongly convex functions, our method achiev...
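Both APCG entries above accelerate a basic randomized proximal coordinate gradient iteration: sample a coordinate block, take a partial gradient step in it, and apply the prox of that block's separable term. The sketch below shows that basic (non-accelerated) iteration for the illustrative instance f(x) = ½||Ax − b||² plus an ℓ1 term, where the block prox is scalar soft-thresholding; the data, step sizes, and names are assumptions for illustration, and APCG's extrapolation sequence is omitted.

```python
# Minimal (non-accelerated) randomized proximal coordinate gradient sketch for
# min_x 0.5*||A x - b||^2 + lam*||x||_1. APCG adds an acceleration sequence on
# top of steps of this form. All data and names are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((40, 10))
b = rng.standard_normal(40)
lam = 0.1
L = (A ** 2).sum(axis=0)           # per-coordinate Lipschitz constants of grad f

def soft_threshold(z, t):
    """Prox of t*|.|, the separable block prox for the l1 term."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

x = np.zeros(10)
r = A @ x - b                      # maintained residual A x - b
for _ in range(5000):
    i = rng.integers(10)           # pick a coordinate uniformly at random
    g_i = A[:, i] @ r              # partial gradient of f at coordinate i
    x_new = soft_threshold(x[i] - g_i / L[i], lam / L[i])
    r += A[:, i] * (x_new - x[i])  # cheap residual update for the change
    x[i] = x_new
print("objective:", 0.5 * r @ r + lam * np.abs(x).sum())
```

Maintaining the residual r = Ax − b keeps each coordinate update at O(m) cost, which is what makes coordinate methods attractive for the regularized empirical risk problems these papers target.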
Journal: Math. Program.
Volume: 158
Issue: -
Pages: -
Published: 2016