Search results for: mollifier subgradient

Number of results: 1200

2012
M. Hairer, M. D. Ryser, H. Weber

We consider the stochastic Allen-Cahn equation driven by mollified space-time white noise. We show that, as the mollifier is removed, the solutions converge weakly to 0, independently of the initial condition. If the intensity of the noise simultaneously converges to 0 at a sufficiently fast rate, then the solutions converge to those of the deterministic equation. At the critical rate, the limi...

Journal: Journal of Approximation Theory 1984

2017
Benjamin Grimmer

We present a subgradient method for minimizing non-smooth, non-Lipschitz convex optimization problems. The only structure assumed is that a strictly feasible point is known. We extend the work of Renegar [1] by taking a different perspective, leading to an algorithm which is conceptually more natural, has notably improved convergence rates, and for which the analysis is surprisingly simple. At ...
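For orientation, the basic scheme this abstract builds on can be sketched as a plain subgradient method with diminishing steps and best-iterate tracking. This is a generic illustration, not Grimmer's algorithm (it omits the strictly-feasible-point machinery and the improved rates the paper is about); the objective `f(x) = |x - 3|` and the `1/sqrt(k)` step rule are illustrative choices.

```python
def subgradient_method(f, subgrad, x0, steps=2000):
    """Subgradient method with diminishing steps a_k = 1/sqrt(k).

    Tracks the best iterate seen, since f(x_k) need not decrease
    monotonically for nonsmooth f.
    """
    x, best = x0, x0
    for k in range(1, steps + 1):
        g = subgrad(x)
        x = x - g / k ** 0.5
        if f(x) < f(best):
            best = x
    return best

# Toy example: f(x) = |x - 3|, with subgradient sign(x - 3)
# (any value in [-1, 1] is a valid subgradient at the kink x = 3).
f = lambda x: abs(x - 3)
sg = lambda x: (x > 3) - (x < 3)
x_star = subgradient_method(f, sg, x0=10.0)
```

The best-iterate bookkeeping matters precisely because of the nonsmoothness: the iterates oscillate around the kink rather than settling into it monotonically.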

Journal: Journal of Mathematical Analysis and Applications 1977

Journal: Journal of Mathematical Analysis and Applications 1979

2006

The efficiency of the network flow techniques can be exploited in the solution of nonlinearly constrained network flow problems by means of approximate subgradient methods. In particular, we consider the case where the side constraints (non-network constraints) are convex. We propose to solve the dual problem by using ε-subgradient methods given that the dual function is estimated by minimizing...
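The key fact behind such approximate dual schemes is that an ε-optimal minimizer of the Lagrangian yields an ε-subgradient of the dual function. A numerical sketch of that fact, using a tiny box-constrained LP in place of the abstract's network-flow subproblem (all data below are illustrative):

```python
# Problem: min c.x  s.t.  a.x <= b,  x in [0,1]^2.
# If x_tilde minimizes the Lagrangian at lam to within eps, then the
# residual a.x_tilde - b is an eps-subgradient of the concave dual q.
c, a, b = [-1.0, -2.0], [1.0, 1.0], 1.0

def lagrangian_min(lam, eps=0.0):
    # The exact minimizer over the box is 0/1 per coordinate; to mimic an
    # inexact solve, flip any coordinate whose optimality loss is <= eps.
    x = []
    for ci, ai in zip(c, a):
        best = 1.0 if ci + lam * ai < 0 else 0.0
        if abs(ci + lam * ai) <= eps:
            best = 1.0 - best  # eps-suboptimal choice
        x.append(best)
    return x

def dual(lam):
    x = lagrangian_min(lam)  # exact subproblem solve
    return (sum(ci * xi for ci, xi in zip(c, x))
            + lam * (sum(ai * xi for ai, xi in zip(a, x)) - b))

eps, lam0 = 0.3, 1.2
x_t = lagrangian_min(lam0, eps)
g = sum(ai * xi for ai, xi in zip(a, x_t)) - b  # eps-subgradient candidate
# Check q(mu) <= q(lam0) + g * (mu - lam0) + eps on a grid of dual points.
ok = all(dual(mu) <= dual(lam0) + g * (mu - lam0) + eps
         for mu in [0.0, 0.5, 1.0, 1.5, 2.0, 3.0])
```

Note that `g` here is not an exact subgradient of the dual at `lam0` (that would be 0), yet the relaxed ε-subgradient inequality still holds, which is what makes inexact subproblem solves usable.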

Journal: Math. Program. 1999
Torbjörn Larsson, Michael Patriksson, Ann-Brith Strömberg

Lagrangean dualization and subgradient optimization techniques are frequently used within the field of computational optimization for finding approximate solutions to large, structured optimization problems. The dual subgradient scheme does not automatically produce primal feasible solutions; there is an abundance of techniques for computing such solutions (via penalty functions, tangential app...
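One standard route to primal solutions from a dual subgradient scheme is ergodic averaging of the Lagrangian subproblem solutions. The sketch below illustrates that idea on a tiny LP; it is in the spirit of the primal-recovery literature the abstract surveys, not the authors' specific construction, and the step rule and problem data are illustrative.

```python
# Problem: min c.x  s.t.  a.x <= b,  0 <= x_i <= 1  (a tiny LP).
def dual_subgradient_with_averaging(c, a, b, iters=2000):
    lam = 0.0
    avg = [0.0] * len(c)
    for k in range(1, iters + 1):
        # Lagrangian subproblem separates per coordinate over the box [0, 1].
        x = [1.0 if ci + lam * ai < 0 else 0.0 for ci, ai in zip(c, a)]
        # Ergodic (running) average of the subproblem solutions.
        avg = [v + (xi - v) / k for v, xi in zip(avg, x)]
        # A subgradient of the dual at lam is the constraint residual.
        g = sum(ai * xi for ai, xi in zip(a, x)) - b
        lam = max(0.0, lam + g / k)  # diminishing step 1/k, kept nonnegative
    return avg, lam

x_bar, lam = dual_subgradient_with_averaging(c=[-1.0, -2.0], a=[1.0, 1.0], b=1.0)
# The individual iterates x jump between vertices of the box; the ergodic
# average x_bar approaches the primal solution (0, 1) of this LP.
```

The point the abstract makes shows up directly here: no single subproblem solution need be primal feasible, but their average converges to one.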

2009
Alexander Segal

Simultaneous subgradient projection algorithms for the convex feasibility problem use subgradient calculations and converge sometimes even in the inconsistent case. We devise an algorithm that uses seminorm-induced oblique projections onto super half-spaces of the convex sets, which is advantageous when the subgradient-Jacobian is a sparse matrix at many iteration points of the algorithm. Using...
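A bare-bones simultaneous subgradient projection step, for readers unfamiliar with the setup: each violated constraint set contributes its classical subgradient projection, and the iterate moves to their average. This is a generic sketch with equal weights and Euclidean geometry, not the seminorm-induced oblique projections of the abstract; the two constraint sets below are illustrative.

```python
def simultaneous_subgradient_projection(constraints, x0, iters=200):
    # constraints: list of (g, subgrad) pairs with g convex; the feasibility
    # problem is to find x with g(x) <= 0 for every pair.
    x = list(x0)
    for _ in range(iters):
        projections = []
        for g, subgrad in constraints:
            v = g(x)
            if v > 0:  # only violated sets move the point
                s = subgrad(x)
                n2 = sum(si * si for si in s)
                # Subgradient projection onto the halfspace {y: g(x)+s.(y-x)<=0}.
                projections.append([xi - (v / n2) * si
                                    for xi, si in zip(x, s)])
        if not projections:
            break  # x lies in every set
        x = [sum(p[j] for p in projections) / len(projections)
             for j in range(len(x))]
    return x

# Feasibility: unit disc (x1^2 + x2^2 <= 1) intersected with x1 >= 0.5.
disc = (lambda x: x[0] ** 2 + x[1] ** 2 - 1, lambda x: [2 * x[0], 2 * x[1]])
half = (lambda x: 0.5 - x[0], lambda x: [-1.0, 0.0])
x = simultaneous_subgradient_projection([disc, half], x0=[3.0, 4.0])
```

Only the subgradients of the constraint functions are needed, never an exact projection onto the disc itself, which is what makes this family of methods cheap per iteration.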

1995
Michael Patriksson

Subgradient methods are popular tools for nonsmooth, convex minimization, especially in the context of Lagrangean relaxation; their simplicity has been a main contribution to their success. As a consequence of the nonsmoothness, it is not straightforward to monitor the progress of a subgradient method in terms of the approximate fulfillment of optimality conditions, since the subgradients used i...
