Search results for: locally nonconvex lipschitz function
Number of results: 1,291,344
We describe an extension of the classical cutting plane algorithm to tackle the unconstrained minimization of a nonconvex, not necessarily differentiable function of several variables. The method is based on constructing both a lower and an upper polyhedral approximation to the objective function and is related to the concept of the proximal trajectory. Convergence to a station...
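To make the construction above concrete: a lower polyhedral (cutting-plane) approximation is the pointwise maximum of linearizations built from previously visited points and their subgradients. The sketch below is only illustrative (the test function, points, and names are ours, not the paper's); in the nonconvex setting the paper pairs such a lower model with an upper one.

```python
import numpy as np

def lower_model(x, points, values, subgrads):
    """Lower polyhedral (cutting-plane) model:
    max_i [ f(x_i) + <g_i, x - x_i> ] over the collected cuts."""
    return max(v + g @ (x - p) for p, v, g in zip(points, values, subgrads))

# Illustration on f(x) = |x|, with subgradient sign(x) at the sampled points
f = lambda x: abs(x[0])
pts = [np.array([-1.0]), np.array([2.0])]
vals = [f(p) for p in pts]
grads = [np.array([np.sign(p[0])]) for p in pts]
print(lower_model(np.array([0.5]), pts, vals, grads))  # 0.5, matching f here
```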
We present an effective algorithm for the minimization of locally nonconvex Lipschitz functions based on mollifier functions approximating the Clarke generalized gradient. To this aim, we first approximate the Clarke generalized gradient by mollifier subgradients. To construct this approximation, we use a set of gradients of averaged functions. Then, we show that the convex hull of this set serves as ...
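One standard way to realize the averaged (mollified) functions mentioned here is Gaussian smoothing, whose gradient admits a simple Monte-Carlo estimate. The snippet below is a rough sketch under that assumption (the mollifier choice, sample count, and names are ours), not the authors' algorithm.

```python
import numpy as np

def averaged_gradient(f, x, eps=1e-2, samples=200, seed=None):
    """Monte-Carlo estimate of the gradient of the averaged function
    f_eps(x) = E[f(x + eps*u)], u ~ N(0, I).  Gradients of such averaged
    functions are the mollifier subgradients used to approximate the
    Clarke generalized gradient (illustrative assumption: Gaussian mollifier)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    grads = []
    for _ in range(samples):
        u = rng.standard_normal(x.shape)
        # two-point (central) estimator of the smoothed directional derivative
        grads.append((f(x + eps * u) - f(x - eps * u)) / (2 * eps) * u)
    return np.mean(grads, axis=0)

# Example: f(x) = |x1| + |x2| near the kink in the first coordinate
f = lambda x: np.abs(x).sum()
print(averaged_gradient(f, [0.0, 1.0], eps=1e-3))  # roughly [0, 1]
```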
The proximal point mapping is the basis of many optimization techniques for convex functions. By means of variational analysis, the concept of proximal mapping was recently extended to nonconvex functions that are prox-regular and prox-bounded. In such a setting, the proximal point mapping is locally Lipschitz continuous and its set of fixed points coincides with the critical points of the origi...
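For reference, the proximal point mapping in question is prox_{λf}(x) = argmin_y { f(y) + ‖y − x‖²/(2λ) }; a fixed point x = prox_{λf}(x) is exactly a critical point of f in the setting described above. A minimal one-dimensional numerical sketch (the helper name, bounded search, and example are our assumptions):

```python
import numpy as np
from scipy.optimize import minimize_scalar

def prox(f, x, lam=1.0, radius=5.0):
    """Numerically evaluate prox_{lam*f}(x) = argmin_y f(y) + (y - x)^2 / (2*lam)
    by a bounded scalar search (sufficient for this 1-D illustration)."""
    obj = lambda y: f(y) + (y - x) ** 2 / (2 * lam)
    return minimize_scalar(obj, bounds=(x - radius, x + radius), method="bounded").x

# f(y) = |y| is convex; its prox is soft-thresholding, so the result is ~1.0
print(prox(abs, 2.0, lam=1.0))
```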
We propose an algorithm for solving nonsmooth, nonconvex, constrained optimization problems as well as a new set of visualization tools for comparing the performance of optimization algorithms. Our algorithm is a sequential quadratic optimization method that employs BFGS quasi-Newton Hessian approximations and an exact penalty function whose parameter is controlled using a steering strategy. We...
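Exact penalty functions used in methods of this kind are typically of ℓ1 type, φ(x; ρ) = f(x) + ρ Σ_i max(0, c_i(x)) for constraints c_i(x) ≤ 0, with the steering strategy adjusting ρ. A hedged sketch of such a penalty (not the authors' code; the names and example are illustrative):

```python
import numpy as np

def l1_exact_penalty(f, cons, x, rho):
    """phi(x; rho) = f(x) + rho * sum_i max(0, c_i(x)) for constraints c_i(x) <= 0.
    For rho large enough relative to the multipliers, minimizers of phi
    coincide with constrained minimizers (the 'exactness' property)."""
    x = np.asarray(x, dtype=float)
    violation = sum(max(0.0, c(x)) for c in cons)
    return f(x) + rho * violation

# Example: minimize f(x) = (x1 - 2)^2 subject to x1 <= 1
f = lambda x: (x[0] - 2.0) ** 2
cons = [lambda x: x[0] - 1.0]                 # c(x) <= 0  <=>  x1 <= 1
print(l1_exact_penalty(f, cons, [1.5], rho=10.0))  # penalized value at an infeasible point
```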
In this paper, a new algorithm for locally minimizing nonsmooth, nonconvex functions is developed. We introduce the notions of secants and quasisecants for nonsmooth functions, and apply quasisecants to find descent directions of locally Lipschitz functions. We design a minimization algorithm built on these quasisecant descent directions. We prove that this algorithm converges to Clarke s...
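In quasisecant-type methods the descent direction is typically obtained from the minimum-norm element of the convex hull of the collected quasisecants; a small norm certifies approximate stationarity. The two-vector special case below is meant only to illustrate that step (the closed form shown is our simplification, not taken from the paper):

```python
import numpy as np

def min_norm_in_hull(g1, g2):
    """Minimum-norm point of the segment conv{g1, g2}:
    argmin_{t in [0,1]} || t*g1 + (1-t)*g2 ||^2, in closed form for two points.
    Minus this point serves as the trial descent direction; a norm near zero
    signals approximate stationarity."""
    d = g1 - g2
    denom = d @ d
    t = 0.5 if denom == 0 else np.clip(-(g2 @ d) / denom, 0.0, 1.0)
    return t * g1 + (1 - t) * g2

# Two (quasi)subgradients collected near a kink of f(x) = |x1| + x2:
g1, g2 = np.array([1.0, 1.0]), np.array([-1.0, 1.0])
w = min_norm_in_hull(g1, g2)
print(w, -w)   # w = [0, 1]; -w is the candidate descent direction
```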
Here, 2 ≤ p < ∞, j : Z × R → R is a function which is measurable in z ∈ Z and locally Lipschitz in x ∈ R, and ∂j(z,x) is the Clarke subdifferential of j(z, ·). If f : Z × R → R is a measurable function, in general discontinuous in the variable x ∈ R, such that for almost all z ∈ Z, all M > 0, and all |x| ≤ M we have |f(z,x)| ≤ a_M(z) with a_M ∈ L¹(Z), and we set j(z,x) = ∫₀ˣ f(z,r) dr, then j(z...
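A minimal concrete instance of this construction, with the z-dependence dropped: a bounded but discontinuous integrand still produces a locally Lipschitz potential, and the Clarke subdifferential fills in the jump.

```latex
\[
  f(x) = \operatorname{sign}(x), \qquad
  j(x) = \int_0^x f(r)\,dr = |x|, \qquad
  \partial j(0) = [-1,\, 1].
\]
```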
We present a new proximity control bundle algorithm to minimize nonsmooth and nonconvex locally Lipschitz functions. In contrast with the traditional oracle-based methods in nonsmooth programming, our method is model-based and can accommodate cases where several Clarke subgradients can be computed at reasonable cost. We propose a new way to manage the proximity control parameter, which allows u...
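The subproblem that the proximity control parameter governs is, roughly, minimization of the cutting-plane model plus a quadratic proximity term ‖y − x‖²/(2t), where enlarging t permits longer steps. A toy sketch under that reading (a general-purpose minimizer stands in for the QP solver such a method would actually use; all names are ours):

```python
import numpy as np
from scipy.optimize import minimize

def bundle_step(x, cuts, t):
    """Proximity-control bundle subproblem (sketch): the next trial point y
    minimizes  max_i [ f(x_i) + <g_i, y - x_i> ] + ||y - x||^2 / (2t).
    Adjusting t between serious and null steps is the 'proximity control'."""
    model = lambda y: max(fi + gi @ (y - xi) for xi, fi, gi in cuts)
    obj = lambda y: model(y) + np.dot(y - x, y - x) / (2 * t)
    return minimize(obj, x, method="Nelder-Mead").x  # tiny dimensions only

# Two cuts of f(y) = |y|, starting from x = 2; the step lands near y = 1
cuts = [(np.array([2.0]), 2.0, np.array([1.0])),
        (np.array([-1.0]), 1.0, np.array([-1.0]))]
print(bundle_step(np.array([2.0]), cuts, t=1.0))
```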
Lexicographic derivatives developed by Nesterov and directed subdifferentials developed by Baier, Farkhi, and Roshchina are both essentially nonconvex generalized derivatives for nonsmooth nonconvex functions and satisfy strict calculus rules and mean-value theorems. This article aims to clarify the relationship between the two generalized derivatives. In particular, for scalar-valued functions...
In this paper, we propose a smoothing sequential quadratic programming (SSQP) algorithm for solving a class of nonsmooth, nonconvex, perhaps even non-Lipschitz minimization problems, which has wide applications in statistics and sparse reconstruction. At each step, the SSQP algorithm solves a strongly convex quadratic minimization problem with a diagonal Hessian matrix, which has a sim...
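The per-iteration subproblem described, a strongly convex quadratic with a diagonal Hessian, separates by coordinate and is solvable in closed form, which is what makes each step cheap. A sketch under that assumption (function and parameter names are ours; the optional bounds are shown only to illustrate the separable structure):

```python
import numpy as np

def diagonal_qp_step(grad, diag_hess, lower=None, upper=None):
    """Solve  min_d  grad^T d + 0.5 * d^T diag(diag_hess) d   (diag_hess > 0),
    optionally with box bounds on d.  With a diagonal Hessian the problem
    separates by coordinate: d_i = -g_i / h_i, clipped to the bounds."""
    d = -np.asarray(grad, dtype=float) / np.asarray(diag_hess, dtype=float)
    if lower is not None or upper is not None:
        d = np.clip(d, lower, upper)
    return d

print(diagonal_qp_step([4.0, -2.0], [2.0, 0.5]))  # -> [-2.  4.]
```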