Search results for: mollifier subgradient

Number of results: 1200

2011
N. Mahdavi-Amiri, Francis Clarke

We present an effective algorithm for minimizing locally nonconvex Lipschitz functions based on mollifier functions approximating the Clarke generalized gradient. To this end, we first approximate the Clarke generalized gradient by mollifier subgradients. To construct this approximation, we use a set of gradients of averaged functions. Then, we show that the convex hull of this set serves as ...
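The construction described in this abstract can be illustrated numerically. The following is a minimal Monte Carlo sketch, assuming a Gaussian mollifier and finite-difference gradients; the names approx_mollifier_subgradient and finite_diff_grad are illustrative and not taken from the paper.

    import numpy as np

    def finite_diff_grad(f, x, h=1e-6):
        """Central finite-difference gradient of f at x (f only needs to be Lipschitz)."""
        g = np.zeros_like(x, dtype=float)
        for i in range(x.size):
            e = np.zeros_like(x, dtype=float)
            e[i] = h
            g[i] = (f(x + e) - f(x - e)) / (2 * h)
        return g

    def approx_mollifier_subgradient(f, x, eps=1e-2, n_samples=50, seed=0):
        """Monte Carlo estimate of the gradient of the averaged function
        f_eps(x) = E[f(x + eps*Z)], with Z drawn from a Gaussian mollifier.

        Each sampled gradient approximates a gradient of f near x; their mean
        lies in the convex hull of the samples and serves as an approximate
        mollifier subgradient (an approximation of a Clarke subgradient).
        """
        rng = np.random.default_rng(seed)
        grads = np.array([finite_diff_grad(f, x + eps * rng.standard_normal(x.size))
                          for _ in range(n_samples)])
        return grads.mean(axis=0), grads

    # Example on a nonsmooth function: f(x) = |x1| + max(x1, x2)
    f = lambda x: abs(x[0]) + max(x[0], x[1])
    g_bar, G = approx_mollifier_subgradient(f, np.array([0.0, 0.0]))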

Journal: Bulletin of the Iranian Mathematical Society, 2011
N. Mahdavi-Amiri, R. Yousefpour

Journal: Proceedings of the National Academy of Sciences of the United States of America, 1975
N. Levinson

A mollifier played a key role in showing N_0(T) > (1/3)N(T) for large T in ref. 1 [Levinson, N. (1974) Advan. Math. 13, 383-436]. A basic problem in ref. 1 was that of obtaining an upper bound for a sum of two terms, one larger than the other. Here a deductive procedure is given for finding a mollifier that actually minimizes the larger term. An Euler-Lagrange equation is obtained. (Optimization...
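For reference, the Euler-Lagrange equation invoked here is the standard stationarity condition of the calculus of variations; for a functional of the generic form below (Levinson's specific mollifier functional is not reproduced), it reads:

    J[\varphi] = \int_a^b F\bigl(t, \varphi(t), \varphi'(t)\bigr)\, dt,
    \qquad
    \frac{\partial F}{\partial \varphi} - \frac{d}{dt}\,\frac{\partial F}{\partial \varphi'} = 0 .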

2006
Ryuichi Fukuoka

Let M be a differentiable manifold. We say that a tensor field g defined on M is non-regular if g is in some local L^p space or if g is continuous. In this work we define a mollifier smoothing g_ε of g with the following feature: if g is a Riemannian metric of class C, then the Levi-Civita connection and the Riemannian curvature tensor of g_ε converge to the Levi-Civita connection and to the R...
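For orientation, in a single coordinate chart a mollifier smoothing follows the standard convolution recipe; this local sketch omits the paper's chart-by-chart gluing on the manifold:

    \rho_\varepsilon(y) = \varepsilon^{-n} \rho(y/\varepsilon),
    \quad \rho \in C_c^\infty(\mathbb{R}^n),\ \rho \ge 0,\ \int \rho = 1,
    \qquad
    (g_\varepsilon)_{ij}(x) = \int_{\mathbb{R}^n} \rho_\varepsilon(y)\, g_{ij}(x - y)\, dy .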

Journal: Annali di Matematica Pura ed Applicata (1923 -), 2020

Journal: Math. Comput., 2016
Katy Craig, Andrea L. Bertozzi

Abstract. Motivated by classical vortex blob methods for the Euler equations, we develop a numerical blob method for the aggregation equation. This provides a counterpoint to existing literature on particle methods. By regularizing the velocity field with a mollifier or “blob function”, the blob method has a faster rate of convergence and allows a wider range of admissible kernels. In fact, we ...
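A minimal particle-level sketch of the idea (regularize the velocity field by convolving the interaction kernel with a blob function), assuming a Gaussian blob as a stand-in for the mollified kernel; the names and parameters are illustrative, not the paper's.

    import numpy as np

    def blob_velocities(x, m, eps):
        """Regularized particle velocities v_i = -sum_j m_j grad K_eps(x_i - x_j).

        Here K_eps(r) = exp(-|r|^2 / (2 eps^2)) is a Gaussian blob standing in
        for a mollified interaction kernel; the paper treats a wider class.
        """
        diff = x[:, None, :] - x[None, :, :]                    # pairwise x_i - x_j
        r2 = np.sum(diff**2, axis=-1)
        grad_K = -diff * (np.exp(-r2 / (2 * eps**2)) / eps**2)[..., None]
        return -np.einsum('j,ijd->id', m, grad_K)

    def blob_method_step(x, m, eps, dt):
        """One forward-Euler step of the particle (blob) method for dx_i/dt = v_i."""
        return x + dt * blob_velocities(x, m, eps)

    # 100 equal-mass particles in 2D, a few explicit time steps
    rng = np.random.default_rng(0)
    x = rng.standard_normal((100, 2))
    m = np.full(100, 1.0 / 100)
    for _ in range(10):
        x = blob_method_step(x, m, eps=0.5, dt=0.1)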

2011
Lingjie Weng, Yutian Chen

Stochastic subgradient methods play an important role in machine learning. In this project we introduce the concepts of subgradient methods and stochastic subgradient methods, discuss their convergence conditions, and compare their strengths and weaknesses against competing methods. We demonstrate the application of (stochastic) subgradient methods to machine learning with a running example of tr...
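As a concrete instance of a stochastic subgradient method, the sketch below minimizes the nonsmooth hinge-loss SVM objective with the classical 1/(lambda*t) step size; the running example and names are common illustrative choices, not necessarily the ones used in this project (its example is truncated above).

    import numpy as np

    def stochastic_subgradient_svm(X, y, lam=0.1, n_iters=1000, seed=0):
        """Stochastic subgradient descent on the nonsmooth SVM objective
        (lam/2)||w||^2 + (1/n) sum_i max(0, 1 - y_i <w, x_i>).

        At a kink of the hinge loss any subgradient is valid; we pick 0 there.
        """
        rng = np.random.default_rng(seed)
        n, d = X.shape
        w = np.zeros(d)
        for t in range(1, n_iters + 1):
            i = rng.integers(n)
            margin = y[i] * (X[i] @ w)
            subgrad = lam * w - (y[i] * X[i] if margin < 1 else 0.0)
            w -= subgrad / (lam * t)            # step size 1/(lam*t)
        return w

    # Toy linearly separable data
    rng = np.random.default_rng(1)
    X = rng.standard_normal((200, 2))
    y = np.sign(X[:, 0] + 0.3 * X[:, 1])
    w = stochastic_subgradient_svm(X, y)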

In this paper, we present a new approach for solving the absolute value equation (AVE) which uses the Levenberg-Marquardt method with a conjugate subgradient structure. In conjugate subgradient methods, the new direction is obtained by combining the steepest descent direction and the previous direction, which may not lead to good numerical results. Therefore, we replace the steepest descent dir...
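A plain generalized-Jacobian Levenberg-Marquardt sketch for the AVE Ax - |x| = b; it omits the conjugate subgradient direction that the paper introduces, and the function name and defaults are illustrative.

    import numpy as np

    def ave_levenberg_marquardt(A, b, x0=None, mu=1e-3, tol=1e-10, max_iter=100):
        """Levenberg-Marquardt iteration for the absolute value equation Ax - |x| = b.

        Residual: F(x) = Ax - |x| - b.  An element of the generalized Jacobian
        of F at x is J(x) = A - diag(sign(x)); each step solves the damped
        normal equations (J^T J + mu I) d = -J^T F.
        """
        n = A.shape[0]
        x = np.zeros(n) if x0 is None else np.asarray(x0, dtype=float)
        for _ in range(max_iter):
            F = A @ x - np.abs(x) - b
            if np.linalg.norm(F) < tol:
                break
            J = A - np.diag(np.sign(x))
            d = np.linalg.solve(J.T @ J + mu * np.eye(n), -J.T @ F)
            x = x + d
        return x

    # If all singular values of A exceed 1, the AVE has a unique solution for any b.
    A = np.array([[4.0, 1.0], [0.5, 3.0]])
    x_true = np.array([1.0, -2.0])
    b = A @ x_true - np.abs(x_true)
    x = ave_levenberg_marquardt(A, b)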

Journal: Astronomy and Astrophysics Supplement Series, 1997

Journal: Numerische Mathematik, 2010
Fethallah Benmansour, Guillaume Carlier, Gabriel Peyré, Filippo Santambrogio

This paper describes the Subgradient Marching algorithm, which computes the derivative of the geodesic distance with respect to the metric. Since the geodesic distance is a concave function of the metric, the algorithm computes an element of its subgradient in O(N log(N)) operations on a discrete grid of N points. It performs a front propagation that computes the subgradient of a discrete geodesic dist...
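The key concavity fact has a simple graph analogue: the shortest-path distance is a minimum of linear functions of the edge weights, hence concave in the weights, and the 0/1 indicator of a minimizing path is an element of its (super)gradient (what the abstract calls the subgradient). The toy below illustrates that fact; it is not the paper's Subgradient Marching scheme on a discretized metric.

    import heapq
    from collections import defaultdict

    def dijkstra_with_subgradient(edges, source, target):
        """Shortest-path distance d(W) = min over paths of sum of W_e, and the
        0/1 edge indicator of one minimizing path, which is a (super)gradient
        of the concave function W -> d(W).

        edges: dict {(u, v): weight} describing an undirected graph.
        """
        adj = defaultdict(list)
        for (u, v), w in edges.items():
            adj[u].append((v, w, (u, v)))
            adj[v].append((u, w, (u, v)))
        dist, prev_edge = {source: 0.0}, {}
        heap = [(0.0, source)]
        while heap:
            d, u = heapq.heappop(heap)
            if d > dist.get(u, float('inf')):
                continue
            for v, w, e in adj[u]:
                if d + w < dist.get(v, float('inf')):
                    dist[v], prev_edge[v] = d + w, e
                    heapq.heappush(heap, (d + w, v))
        # Walk back along one shortest path and mark its edges.
        subgrad, node = {e: 0.0 for e in edges}, target
        while node != source:
            e = prev_edge[node]
            subgrad[e] = 1.0
            node = e[0] if e[1] == node else e[1]
        return dist[target], subgrad

    edges = {('a', 'b'): 1.0, ('b', 'c'): 2.0, ('a', 'c'): 4.0}
    d, g = dijkstra_with_subgradient(edges, 'a', 'c')   # d = 3.0, marks a-b and b-c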

Chart: number of search results per year
