A Golden Ratio Primal–Dual Algorithm for Structured Convex Optimization

Authors

Abstract

We design, analyze and test a golden ratio primal–dual algorithm (GRPDA) for solving a structured convex optimization problem, where the objective function is the sum of two closed proper convex functions, one of which involves composition with a linear transform. GRPDA preserves all the favorable features of the classical primal–dual algorithm (PDA), i.e., the primal and dual variables are updated in a Gauss–Seidel manner, and the per-iteration cost is dominated by the evaluation of the proximal point mappings of the component functions and matrix–vector multiplications. Compared with the classical PDA, which takes an extrapolation step, the novelty of GRPDA is that it is constructed based on a convex combination of essentially the whole trajectory. We show that GRPDA converges within a broader range of parameters than the classical PDA, provided that the reciprocal of the convex combination parameter is bounded above by the golden ratio, which explains the name of the algorithm. An $$\mathcal{O}(1/N)$$ ergodic convergence rate result is also established based on the gap function, where N denotes the number of iterations. When either the primal or the dual problem is strongly convex, an accelerated GRPDA is constructed to improve the ergodic convergence rate from $$\mathcal{O}(1/N)$$ to $$\mathcal{O}(1/N^2)$$. Moreover, we show that for regularized least-squares and linear equality constrained problems, the reciprocal of the convex combination parameter can be extended to 2, and meanwhile a relaxation step can be taken. Our preliminary numerical results on LASSO, nonnegative least-squares and minimax matrix game problems, with comparisons to some state-of-the-art relative algorithms, demonstrate the efficiency of the proposed algorithms.
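The abstract describes an objective of the form min_x f(x) + g(Kx), a Gauss–Seidel primal–dual update driven by the proximal mappings of the two component functions, and an extrapolation replaced by a convex combination of the trajectory. The Python sketch below applies one plausible reading of such a golden-ratio-style iteration to the LASSO problem mentioned in the experiments; the anchor sequence z, the step-size rule tau*sigma*||K||^2 < psi, and the fixed iteration count are illustrative assumptions, not the paper's exact scheme.

```python
import numpy as np

# Minimal sketch of a golden-ratio-style primal-dual iteration for
#   min_x  lam*||x||_1 + 0.5*||K x - b||^2   (LASSO),
# viewed as the saddle point  min_x max_y  lam*||x||_1 + <K x, y> - g*(y)
# with g(z) = 0.5*||z - b||^2.  Update order and the convex-combination
# "anchor" z follow the description in the abstract; the concrete step
# sizes and stopping rule below are assumptions made for the demo.

def soft_threshold(v, t):
    """Proximal mapping of t*||.||_1 (component-wise shrinkage)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_g_conj(v, sigma, b):
    """Proximal mapping of sigma*g*, where g(z) = 0.5*||z - b||^2,
    so g*(y) = 0.5*||y||^2 + <b, y>."""
    return (v - sigma * b) / (1.0 + sigma)

def grpda_lasso(K, b, lam, n_iter=500, psi=1.618):
    m, n = K.shape
    L = np.linalg.norm(K, 2)                    # ||K|| (largest singular value)
    tau = sigma = 0.99 * np.sqrt(psi) / L       # assumed rule: tau*sigma*||K||^2 < psi
    x = np.zeros(n)
    z = x.copy()                                # anchor: running combination of the trajectory
    y = np.zeros(m)
    for _ in range(n_iter):
        z = ((psi - 1.0) * x + z) / psi                       # convex combination step
        x = soft_threshold(z - tau * (K.T @ y), tau * lam)    # primal proximal step
        y = prox_g_conj(y + sigma * (K @ x), sigma, b)        # dual proximal step (Gauss-Seidel)
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    K = rng.standard_normal((40, 100))
    x_true = np.zeros(100); x_true[:5] = rng.standard_normal(5)
    b = K @ x_true + 0.01 * rng.standard_normal(40)
    x_hat = grpda_lasso(K, b, lam=0.1)
    print("nonzeros:", int(np.sum(np.abs(x_hat) > 1e-6)))
```

As in the abstract's description, each pass costs two matrix–vector products and two proximal evaluations; the only extra work over a plain primal–dual step is maintaining the running combination z.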


Related articles

A Golden Ratio Parameterized Algorithm for Cluster Editing

The Cluster Editing problem asks to transform a graph by at most k edge modifications into a disjoint union of cliques. The problem is NP-complete, but several parameterized algorithms are known. We present a novel search tree algorithm for the problem, which improves the running time from O(1.76^k + m + n) to O(1.62^k + m + n) for m edges and n vertices. In detail, we can show that we can always branc...
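For context, the sketch below implements the classical three-way branching on a conflict triple (edges uv and uw present, vw absent), which gives the well-known O(3^k) search tree; the refined branching rules that bring the base down to 1.62 are specific to the paper and not reproduced here. The adjacency-set representation and the small example graph are illustrative choices.

```python
from itertools import combinations

def find_conflict(adj):
    """Return a conflict triple (u, v, w) with edges uv, uw present and vw
    absent, or None if the graph is already a disjoint union of cliques."""
    for u in adj:
        for v, w in combinations(sorted(adj[u]), 2):
            if w not in adj[v]:
                return u, v, w
    return None

def cluster_editing(adj, k):
    """Decide whether at most k edge insertions/deletions turn the graph into
    a union of cliques.  Branches mutate adj in place and undo the change
    when a branch fails (plain O(3^k) branching sketch)."""
    if k < 0:
        return False
    triple = find_conflict(adj)
    if triple is None:
        return True
    u, v, w = triple
    # Branch 1: delete edge uv.
    adj[u].discard(v); adj[v].discard(u)
    if cluster_editing(adj, k - 1):
        return True
    adj[u].add(v); adj[v].add(u)
    # Branch 2: delete edge uw.
    adj[u].discard(w); adj[w].discard(u)
    if cluster_editing(adj, k - 1):
        return True
    adj[u].add(w); adj[w].add(u)
    # Branch 3: insert edge vw.
    adj[v].add(w); adj[w].add(v)
    if cluster_editing(adj, k - 1):
        return True
    adj[v].discard(w); adj[w].discard(v)
    return False

if __name__ == "__main__":
    # Path a-b-c plus isolated d: a single edit suffices.
    adj = {"a": {"b"}, "b": {"a", "c"}, "c": {"b"}, "d": set()}
    print(cluster_editing(adj, 1))   # True
```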


An Interior Point Algorithm for Solving Convex Quadratic Semidefinite Optimization Problems Using a New Kernel Function

In this paper, we consider convex quadratic semidefinite optimization problems and provide a primal–dual Interior Point Method (IPM) based on a new kernel function with a trigonometric barrier term. The iteration complexity of the algorithm is analyzed using some mild and easy-to-check conditions. Although our proposed kernel function is neither a Self-Regular (SR) fun...


Accelerated gradient sliding for structured convex optimization

Our main goal in this paper is to show that one can skip gradient computations for gradient descent type methods applied to certain structured convex programming (CP) problems. To this end, we first present an accelerated gradient sliding (AGS) method for minimizing the summation of two smooth convex functions with different Lipschitz constants. We show that the AGS method can skip the gradient...


Structured Sparsity and Convex Optimization

The concept of parsimony is central in many scientific domains. In the context of statistics, signal processing or machine learning, it takes the form of variable or feature selection problems, and is commonly used in two situations: First, to make the model or the prediction more interpretable or cheaper to use, i.e., even if the underlying problem does not admit sparse solutions, one looks fo...


Structured sparsity through convex optimization

Sparse estimation methods are aimed at using or obtaining parsimonious representations of data or models. While naturally cast as a combinatorial optimization problem, variable or feature selection admits a convex relaxation through the regularization by the l1-norm. In this paper, we consider situations where we are not only interested in sparsity, but where some structural prior knowledge is ...
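As a tiny concrete illustration of that relaxation: for an orthogonal design, penalizing the number of nonzeros (the combinatorial view) reduces to hard thresholding, while the l1 penalty reduces to soft thresholding, which is convex yet still zeroes out small coefficients. The threshold value in the snippet is an arbitrary choice for the demo.

```python
import numpy as np

y = np.array([3.0, 0.4, -2.5, 0.1, 0.8])   # coefficients under an orthogonal design
t = 1.0                                     # regularization-dependent threshold (illustrative)

# argmin_x 0.5*(x - y)^2 + (t^2/2)*||x||_0  -> keep entries with |y| > t unchanged
hard = np.where(np.abs(y) > t, y, 0.0)
# argmin_x 0.5*(x - y)^2 + t*||x||_1        -> shrink large entries, zero small ones
soft = np.sign(y) * np.maximum(np.abs(y) - t, 0.0)

print("hard:", hard)
print("soft:", soft)
```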



Journal

Journal title: Journal of Scientific Computing

Year: 2021

ISSN: 1573-7691, 0885-7474

DOI: https://doi.org/10.1007/s10915-021-01452-9