Search results for: nonsmooth convex optimization problem

Number of results: 1134849

Journal: :Boletim da Sociedade Paranaense de Matemática 2022

Optimal correction of an infeasible system of equations of the form Ax + B|x| = b leads to a non-convex fractional problem. In this paper, a regularization method (ℓp-norm, 0 < p ≤ 1) is presented to solve it; the resulting problem can be formulated as a nonsmooth optimization problem that is not Lipschitz. The objective function is decomposed into a difference of convex functions (DC). For this reason, we use a special smo...

2004
W. L. Hare, A. S. Lewis

Active set algorithms, such as the projected gradient method in nonlinear optimization, are designed to “identify” the active constraints of the problem in a finite number of iterations. Using the notions of “partial smoothness” and “prox-regularity” we extend work of Burke, Moré and Wright on identifiable surfaces from the convex case to a general nonsmooth setting. We further show how this se...
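As a toy illustration of the identification phenomenon this abstract describes (not the paper's partial-smoothness framework), projected gradient on a box-constrained quadratic pins down the active constraints after finitely many iterations; the problem data below are made up for the sketch:

```python
import numpy as np

# Minimize f(x) = 0.5*||x - c||^2 subject to x >= 0,
# via projected gradient. The active set {i : x_i = 0}
# stabilizes after finitely many iterations.
c = np.array([1.0, -2.0, 0.5, -0.3])
x = np.ones(4)
step = 0.5
for k in range(50):
    grad = x - c
    x = np.maximum(x - step * grad, 0.0)  # projection onto x >= 0

active = np.where(x == 0.0)[0]
print(active.tolist())  # [1, 3]: the constraints active at the solution
```

Here the solution is x* = max(c, 0), so exactly the coordinates with negative c_i end up (and stay) on the boundary, which the iterates identify after a few steps.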

Journal: :Math. Program. 2011
Sangwoon Yun, Paul Tseng, Kim-Chuan Toh

We consider a class of unconstrained nonsmooth convex optimization problems, in which the objective function is the sum of a convex smooth function on an open subset of matrices and a separable convex function on a set of matrices. This problem includes the covariance selection estimation problem that can be expressed as an ℓ1-penalized maximum likelihood estimation problem. In this paper, we p...
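The separable nonsmooth structure described here is what makes proximal (soft-thresholding) steps cheap. A minimal proximal-gradient (ISTA) sketch for a vector ℓ1-penalized least-squares instance, with made-up data (this is not the matrix algorithm of the paper):

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t*||.||_1, applied componentwise.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

# Minimize 0.5*||A x - b||^2 + lam*||x||_1 by proximal gradient (ISTA).
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = A @ np.array([1.0, 0.0, -2.0, 0.0, 0.5]) + 0.01 * rng.standard_normal(20)
lam = 0.5
L = np.linalg.norm(A, 2) ** 2   # Lipschitz constant of the smooth gradient
x = np.zeros(5)
for k in range(500):
    grad = A.T @ (A @ x - b)
    x = soft_threshold(x - grad / L, lam / L)

print(np.round(x, 2))
```

Each iteration takes a gradient step on the smooth part only, then applies the closed-form prox of the separable ℓ1 term.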

2012
Ian En-Hsu Yen, Nanyun Peng, Po-Wei Wang

The Concave-Convex Procedure (CCCP) has been widely used to solve nonconvex d.c. (difference of convex functions) programs that occur in learning problems, such as the sparse support vector machine (SVM), transductive SVM, sparse principal component analysis (PCA), etc. Although the global convergence behavior of CCCP has been well studied, the convergence rate of CCCP is still an open problem. Most of d.c....
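A minimal one-dimensional CCCP sketch on a made-up DC objective, showing the linearize-and-minimize iteration:

```python
import numpy as np

# DC program: f(x) = g(x) - h(x) with g(x) = x^2 (convex) and
# h(x) = 2|x| (also convex). CCCP linearizes h at the current iterate
# and minimizes the resulting convex upper bound:
#   x_{k+1} = argmin_x  g(x) - s_k * x,   with s_k a subgradient of h at x_k.
x = 0.5
for k in range(10):
    s = 2.0 * np.sign(x)   # subgradient of h(x) = 2|x| at x
    x = s / 2.0            # closed-form argmin of x^2 - s*x

print(x)  # 1.0, a local minimizer of f
```

From any positive start the iteration lands on the local minimizer x = 1 (where f(1) = -1) in one step and stays there, illustrating why CCCP's per-iteration progress is easy to observe but its general rate is hard to pin down.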

Journal: :Math. Program. 2016
Guanghui Lan

We consider in this paper a class of composite optimization problems whose objective function is given by the summation of a general smooth and nonsmooth component, together with a relatively simple nonsmooth term. We present a new class of first-order methods, namely the gradient sliding algorithms, which can skip the computation of the gradient for the smooth component from time to time. As a...

2015
Masoud Ahookhosh, Arnold Neumaier

This paper describes an algorithm for solving structured nonsmooth convex optimization problems using the optimal subgradient algorithm (OSGA), which is a first-order method with the complexity O(ε^{-2}) for Lipschitz continuous nonsmooth problems and O(ε^{-1/2}) for smooth problems with Lipschitz continuous gradient. If the nonsmoothness of the problem is manifested in a structured way, we reformulate the...
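For context, the baseline O(ε^{-2}) regime that optimal methods such as OSGA operate in can be illustrated with a plain subgradient method (this is not OSGA itself; the test function is made up):

```python
import numpy as np

# Plain subgradient method on the nonsmooth convex function
#   f(x) = ||x||_1 + 0.5*||x - c||^2,
# whose minimizer is the soft-thresholding of c; here f* = 1.625.
c = np.array([2.0, -0.5])
x = np.zeros(2)
best = np.inf
for k in range(1, 2001):
    g = np.sign(x) + (x - c)   # a subgradient of f at x
    x = x - g / np.sqrt(k)     # diminishing step size 1/sqrt(k)
    f = np.abs(x).sum() + 0.5 * ((x - c) ** 2).sum()
    best = min(best, f)

print(round(best, 3))  # approaches f* = 1.625 slowly
```

The slow O(1/sqrt(k)) decay of the best objective value is exactly what structured reformulations, as in this paper, aim to avoid.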

2016
Canyi Lu, Huan Li, Zhouchen Lin, Shuicheng Yan

The Augmented Lagrangian Method (ALM) and the Alternating Direction Method of Multipliers (ADMM) have been powerful optimization methods for general convex programming subject to linear constraints. We consider the convex problem whose objective consists of a smooth part and a nonsmooth but simple part. We propose the Fast Proximal Augmented Lagrangian Method (Fast PALM) which achieves the convergence ...
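A minimal classical (non-accelerated) ADMM sketch on a lasso-type instance of the smooth-plus-simple-nonsmooth model, with made-up data (not the Fast PALM algorithm itself):

```python
import numpy as np

def soft(v, t):
    # Proximal operator of t*||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

# Classical ADMM on: min 0.5*||A x - b||^2 + lam*||z||_1  s.t.  x = z.
rng = np.random.default_rng(1)
A = rng.standard_normal((30, 4))
b = A @ np.array([1.5, 0.0, -1.0, 0.0])
lam, rho = 0.1, 1.0
x = np.zeros(4); z = np.zeros(4); u = np.zeros(4)
M = np.linalg.inv(A.T @ A + rho * np.eye(4))
for k in range(200):
    x = M @ (A.T @ b + rho * (z - u))   # x-update: ridge-type solve
    z = soft(x + u, lam / rho)          # z-update: prox of the ell_1 term
    u = u + x - z                       # scaled dual update

print(np.round(z, 2))
```

The three updates alternate a smooth subproblem, a cheap prox, and a dual ascent step; Fast PALM's contribution is improving the convergence rate of this kind of scheme.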

2004
Fanwen Meng, Gongyun Zhao

In this paper, we attempt to investigate a class of constrained nonsmooth convex optimization problems, that is, piecewise C2 convex objectives with smooth convex inequality constraints. By using the Moreau-Yosida regularization, we convert these problems into unconstrained smooth convex programs. Then, we investigate the second-order properties of the Moreau-Yosida regularization η. By introdu...
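The Moreau-Yosida regularization used here can be computed in closed form for simple functions; for f(x) = |x| it yields the Huber function, which is smooth even though f is not. A small sketch:

```python
import numpy as np

# Moreau-Yosida regularization (Moreau envelope) of f(x) = |x|:
#   e_t f(x) = min_y ( |y| + (1/(2t)) * (x - y)^2 ),
# whose minimizer y is the soft-thresholding of x. The envelope equals
# the Huber function: x^2/(2t) for |x| <= t, and |x| - t/2 otherwise.
def moreau_envelope_abs(x, t):
    y = np.sign(x) * np.maximum(np.abs(x) - t, 0.0)   # prox of t*|.| at x
    return np.abs(y) + (x - y) ** 2 / (2.0 * t)

t = 1.0
print(moreau_envelope_abs(0.5, t))   # 0.125 = 0.5**2 / 2  (quadratic zone)
print(moreau_envelope_abs(3.0, t))   # 2.5 = 3 - 1/2       (linear zone)
```

This smoothing is exactly what lets the paper convert the constrained nonsmooth program into an unconstrained smooth one and then study its second-order properties.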

Journal: :Siam Journal on Optimization 2021

This paper presents a proximal bundle variant, namely, the relaxed (RPB) method, for solving convex nonsmooth composite optimization problems. Like other variants, R...

Mohammad Bagher Menhaj, Tahereh Esmaeili Abharian

Given that the main weakness of most standard methods, including k-means and hierarchical data clustering, is their sensitivity to initialization and their tendency to become trapped in local minima, this paper proposes a modification of convex data clustering in which there is no need to be careful about how initial values are selected. By properly converting the task of optimization to an equivalent...
