Search results for: m convex function
Number of results: 1717049
In this paper we provide a method to find global minimizers of certain non-convex 2-phase image segmentation problems. This is achieved by formulating a convex minimization problem whose minimizers are also minimizers of the initial non-convex segmentation problem, similar to the approach proposed by Nikolova, Esedoğlu and Chan. The key difference from the latter model is that the new model does ...
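A minimal numerical sketch of the convex-relaxation-plus-thresholding idea behind such models (the function name, the piecewise-constant data term with assumed region values c1 and c2, the smoothed total variation, and the plain projected-gradient solver are all illustrative assumptions, not the model of this paper):

```python
import numpy as np

def convex_two_phase(f, c1, c2, lam=1.0, tau=0.05, n_iter=500, eps=1e-6):
    """Hypothetical sketch (not the paper's algorithm): projected gradient descent on
        min_{0 <= u <= 1}  TV(u) + lam * <((c1 - f)^2 - (c2 - f)^2), u>,
    followed by thresholding, which yields a binary two-phase segmentation."""
    u = np.full_like(f, 0.5, dtype=float)
    r = lam * ((c1 - f) ** 2 - (c2 - f) ** 2)           # pointwise data term
    for _ in range(n_iter):
        # forward differences (Neumann boundary)
        gx = np.diff(u, axis=1, append=u[:, -1:])
        gy = np.diff(u, axis=0, append=u[-1:, :])
        mag = np.sqrt(gx ** 2 + gy ** 2 + eps)
        px, py = gx / mag, gy / mag
        # divergence via backward differences; -div(p) is the gradient of smoothed TV
        div = (np.diff(px, axis=1, prepend=px[:, :1])
               + np.diff(py, axis=0, prepend=py[:1, :]))
        u = np.clip(u - tau * (-div + r), 0.0, 1.0)     # gradient step + projection onto [0, 1]
    return u > 0.5                                      # threshold the convex minimizer
```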
Let (M, ω) be a Kähler manifold. An integrable function φ on M is called ω-plurisubharmonic if the current dd^c φ ∧ ω is positive. We prove that φ is ω-plurisubharmonic if and only if φ is subharmonic on all q-dimensional complex subvarieties. We prove that an ω-plurisubharmonic function is q-convex, and admits a local approximation by smooth, ω-plurisubharmonic functions. For any closed subvariety Z ...
In this paper we study the concept of Latin-majorization. Geometrically, this concept differs from other kinds of majorization in some aspects, since the set of all $x$'s Latin-majorized by a fixed $y$ is not convex, but consists of a union of finitely many convex sets. Next, we hint at linear preservers of Latin-majorization on $\mathbb{R}^{n}$ and $M_{n,m}$.
Integrally convex functions constitute a fundamental function class in discrete convex analysis, including M-convex functions, L-convex functions, and many others. This paper aims at a rather comprehensive survey of recent results on integrally convex functions, together with some new technical results. Topics covered in this paper include characterizations of integrally convex sets and functions, operations, optimality criteria for minimization, and proximity-scaling algo...
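For reference, a hedged sketch of the standard definitions this survey builds on (notation assumed; the usual local-convex-extension definition of integral convexity and the M-convex exchange axiom):

```latex
% Integral neighborhood of a point x in R^n (assumed standard definition):
\[
  N(x) \;=\; \{\, z \in \mathbb{Z}^n \;:\; |x_i - z_i| < 1 \ \ (i = 1,\dots,n) \,\}.
\]
% Local convex extension of f : Z^n -> R \cup \{+\infty\}:
\[
  \tilde f(x) \;=\; \min\Bigl\{ \textstyle\sum_{z \in N(x)} \lambda_z f(z) \;:\;
    \sum_{z} \lambda_z z = x,\ \sum_{z} \lambda_z = 1,\ \lambda_z \ge 0 \Bigr\}.
\]
% f is integrally convex iff \tilde f is convex on R^n.  M-convex functions form the
% subclass satisfying the exchange axiom: for x, y in dom f and i with x_i > y_i there
% exists j with x_j < y_j such that f(x) + f(y) >= f(x - e_i + e_j) + f(y + e_i - e_j).
```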
A popular approach for estimating an unknown signal $x_0 \in \mathbb{R}^n$ from noisy linear measurements $y = Ax_0 + z \in \mathbb{R}^m$ is via solving a so-called regularized M-estimator: $\hat{x} := \arg\min_x L(y - Ax) + \lambda f(x)$. Here, $L$ is a convex loss function, $f$ is a convex (typically, non-smooth) regularizer, and $\lambda > 0$ is a regularization parameter. We analyze the squared-error performance $\|\hat{x} - x_0\|_2^2$ of such estimators in the high-dim...
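As a concrete instance of the generic estimator $\hat{x} := \arg\min_x L(y - Ax) + \lambda f(x)$, choosing $L(v) = \|v\|_2^2$ and $f(x) = \|x\|_1$ (an assumed example, not necessarily the case studied in the paper) gives the LASSO:

```latex
\[
  \hat{x} \;=\; \arg\min_{x \in \mathbb{R}^n} \ \|y - Ax\|_2^2 \;+\; \lambda \|x\|_1,
  \qquad \text{with squared error } \|\hat{x} - x_0\|_2^2 .
\]
```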
In this paper, we consider convex quadratic semidefinite optimization problems and provide a primal-dual interior-point method (IPM) based on a new kernel function with a trigonometric barrier term. The iteration complexity of the algorithm is analyzed using some easy-to-check and mild conditions. Although our proposed kernel function is neither a self-regular (SR) function nor a logarithmic barrier ...
In this paper, we describe and establish the iteration complexity of two accelerated composite gradient (ACG) variants to solve a smooth nonconvex optimization problem whose objective function is the sum of a differentiable function f with Lipschitz continuous gradient and a simple nonsmooth closed convex function h. When f is convex, the first ACG variant reduces to the well-known FISTA for a specific choice of input, hence the former can be viewed as a natural ex...
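Since the snippet points to FISTA as the convex-case specialization, the following is a minimal FISTA sketch for $\min_x f(x) + h(x)$ (the function names, the constant step size $1/L$, and the LASSO usage example are illustrative assumptions, not the paper's ACG variants):

```python
import numpy as np

def fista(grad_f, prox_h, L, x0, n_iter=200):
    """Minimal FISTA sketch: f differentiable with L-Lipschitz gradient `grad_f`,
    h simple nonsmooth convex with proximal map `prox_h(v, step)`."""
    x = y = np.asarray(x0, dtype=float)
    t = 1.0
    for _ in range(n_iter):
        x_new = prox_h(y - grad_f(y) / L, 1.0 / L)          # forward-backward step
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)        # Nesterov momentum
        x, t = x_new, t_new
    return x

# Usage example (assumed): LASSO  min_x 0.5*||Ax - b||^2 + lam*||x||_1
A = np.random.randn(40, 100); b = np.random.randn(40); lam = 0.1
L = np.linalg.norm(A, 2) ** 2                                # Lipschitz constant of grad f
grad_f = lambda x: A.T @ (A @ x - b)
prox_h = lambda v, s: np.sign(v) * np.maximum(np.abs(v) - lam * s, 0.0)  # soft-thresholding
x_hat = fista(grad_f, prox_h, L, np.zeros(100))
```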
0 Introduction. For $n \ge 2$, let $M^n$ be a finite convex, not necessarily smooth, hypersurface in Euclidean space $\mathbb{R}^{n+1}$ containing the origin. More precisely, $M^n$ is the boundary of some convex domain in $\mathbb{R}^{n+1}$ containing a neighborhood of the origin. We write $M^n = \{R(x) = \rho(x)\,x \mid x \in S^n\}$, where $\rho$ is a function from $S^n$ to $\mathbb{R}_+$. Let $\nu : M^n \to S^n$ denote the generalized Gauss map, namely, $\nu(Y)$ is the set of...