Search results for: mollifier subgradient

Number of results: 1200

Journal: Journal of Approximation Theory 1982

Journal: The Journal of the Australian Mathematical Society. Series B. Applied Mathematics 1999

2015
Qi Deng Guanghui Lan Anand Rangarajan

Block coordinate descent methods and stochastic subgradient methods have been extensively studied in optimization and machine learning. By combining randomized block sampling with stochastic subgradient methods based on dual averaging ([22, 36]), we present stochastic block dual averaging (SBDA)—a novel class of block subgradient methods for convex nonsmooth and stochastic optimization. SBDA re...
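
A minimal sketch of the block-sampled dual-averaging idea described above, not the paper's exact SBDA updates or step-size policy; the L1-regression objective, the uniform block sampling, and the scale gamma are assumptions for illustration.

# Sketch only: randomized block subgradient steps combined with dual averaging,
# in the spirit of SBDA. The objective f(x) = (1/m) * sum_j |a_j.x - b_j| and
# all parameters below are illustrative assumptions, not the paper's setup.
import numpy as np

rng = np.random.default_rng(0)
m, n, n_blocks = 200, 40, 4
A = rng.normal(size=(m, n))
b = A @ rng.normal(size=n) + 0.1 * rng.normal(size=m)

blocks = np.array_split(np.arange(n), n_blocks)   # coordinate blocks
x = np.zeros(n)                                   # primal iterate (x0 = 0)
z = np.zeros(n)                                   # accumulated subgradients (dual average)
counts = np.zeros(n_blocks)                       # per-block update counters
gamma = 1.0                                       # step-size scale (tuning assumption)

def stochastic_subgradient(x):
    """Unbiased stochastic subgradient of f: sample one data row."""
    j = rng.integers(m)
    return np.sign(A[j] @ x - b[j]) * A[j]

for k in range(20000):
    i = rng.integers(n_blocks)                    # sample a block uniformly at random
    g = stochastic_subgradient(x)
    idx = blocks[i]
    z[idx] += g[idx]                              # accumulate the block of the subgradient
    counts[i] += 1
    beta = gamma * np.sqrt(counts[i])             # block-local averaging weight
    x[idx] = -z[idx] / beta                       # dual-averaging (prox) step from x0 = 0

print("objective:", np.mean(np.abs(A @ x - b)))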

1998
M. PATRIKSSON

When nonsmooth, convex minimization problems are solved by subgradient optimization methods, the subgradients used will in general not accumulate to subgradients which verify the optimality of a solution obtained in the limit. It is therefore not a straightforward task to monitor the progress of a subgradient method in terms of the approximate fulfillment of optimality conditions. Further, certain ...
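
To illustrate the monitoring issue, the following toy sketch (an assumption-laden example, not the paper's construction) compares an optimality residual evaluated with the last raw subgradient against the same residual evaluated with step-weighted (ergodic) averages of the past iterates and subgradients, on a box-constrained L1 problem.

# Sketch only: projected subgradient method for minimizing f(x) = ||x - c||_1
# over the box [0, 1]^n. The problem, step sizes a_k = 1/(k+1), and the use of
# step-weighted averages are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
n = 10
c = rng.uniform(-0.5, 1.5, size=n)       # some targets fall outside the box

def subgrad(x):                          # a subgradient of ||x - c||_1
    return np.sign(x - c)

def proj_box(x):                         # projection onto [0, 1]^n
    return np.clip(x, 0.0, 1.0)

x = np.full(n, 0.5)
g_avg = np.zeros(n)                      # ergodic (step-weighted) subgradient average
x_avg = np.zeros(n)                      # ergodic iterate average
w_sum = 0.0

for k in range(5000):
    a = 1.0 / (k + 1)
    g = subgrad(x)
    g_avg = (w_sum * g_avg + a * g) / (w_sum + a)
    x_avg = (w_sum * x_avg + a * x) / (w_sum + a)
    w_sum += a
    x = proj_box(x - a * g)

# Residual x - P_C(x - g) vanishes only if -g lies in the normal cone at x.
res_last = np.linalg.norm(x - proj_box(x - subgrad(x)))
res_ergodic = np.linalg.norm(x_avg - proj_box(x_avg - g_avg))
print("residual with last raw subgradient:", res_last)
print("residual with ergodic averages:    ", res_ergodic)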

Journal: SIAM Journal on Optimization 2009
S. Sundhar Ram Angelia Nedic Venugopal V. Veeravalli

This paper studies the effect of stochastic errors on two constrained incremental subgradient algorithms. The incremental subgradient algorithms are viewed as decentralized network optimization algorithms as applied to minimize a sum of functions, when each component function is known only to a particular agent of a distributed network. First, the standard cyclic incremental subgradient algorit...
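
A minimal sketch of a cyclic incremental subgradient method with noisy subgradients, in the spirit of the setting above; the per-agent L1 terms, the ball constraint, the noise level, and the step-size rule are assumptions for illustration, and the paper's error model and analysis are not reproduced here.

# Sketch only: each of m "agents" holds one term f_i(x) = |a_i.x - b_i|; the
# method cycles through the agents, each taking a (noisy) subgradient step.
import numpy as np

rng = np.random.default_rng(2)
m, n = 50, 10
A = rng.normal(size=(m, n))
b = A @ rng.normal(size=n) + 0.05 * rng.normal(size=m)
radius = 10.0                                  # constraint: ||x|| <= radius
noise_std = 0.01                               # stochastic subgradient error (assumed)

def proj_ball(x):
    nrm = np.linalg.norm(x)
    return x if nrm <= radius else x * (radius / nrm)

x = np.zeros(n)
for cycle in range(2000):
    alpha = 1.0 / (cycle + 1)                  # diminishing step size
    for i in range(m):                         # cyclic pass over the agents
        g = np.sign(A[i] @ x - b[i]) * A[i]    # subgradient of agent i's term
        g_noisy = g + noise_std * rng.normal(size=n)
        x = proj_ball(x - alpha * g_noisy)

print("objective:", np.sum(np.abs(A @ x - b)))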

2007
Angelia Nedić Asuman Ozdaglar

We consider computing the saddle points of a convex-concave function using subgradient methods. The existing literature on finding saddle points has mainly focused on establishing convergence properties of the generated iterates under some restrictive assumptions. In this paper, we propose a subgradient algorithm for generating approximate saddle points and provide per-iteration convergence rat...
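
A minimal sketch of a projected subgradient scheme that averages its iterates to produce an approximate saddle point; the bilinear matrix game over probability simplices, the step sizes, and the gap computation are illustrative assumptions, not the paper's algorithm or rate analysis.

# Sketch only: min_x max_y x^T M y over two probability simplices, solved by
# simultaneous projected (sub)gradient descent/ascent with iterate averaging.
import numpy as np

rng = np.random.default_rng(3)
M = rng.normal(size=(6, 6))

def proj_simplex(v):
    """Euclidean projection onto the probability simplex (sort-based)."""
    u = np.sort(v)[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u + (1.0 - css) / (np.arange(len(v)) + 1) > 0)[0][-1]
    theta = (1.0 - css[rho]) / (rho + 1)
    return np.maximum(v + theta, 0.0)

d = M.shape[0]
x = np.full(d, 1.0 / d)                      # minimizing player
y = np.full(d, 1.0 / d)                      # maximizing player
x_avg, y_avg = np.zeros(d), np.zeros(d)

K = 5000
for k in range(1, K + 1):
    alpha = 1.0 / np.sqrt(k)                 # diminishing step size
    gx = M @ y                               # partial (sub)gradient in x
    gy = M.T @ x                             # partial (sub)gradient in y
    x = proj_simplex(x - alpha * gx)         # descent step for the min player
    y = proj_simplex(y + alpha * gy)         # ascent step for the max player
    x_avg += x
    y_avg += y

x_avg /= K
y_avg /= K
# Saddle-point gap of the averaged iterates: max_y phi(x_avg, y) - min_x phi(x, y_avg).
gap = np.max(M.T @ x_avg) - np.min(M @ y_avg)
print("saddle-point gap of averaged iterates:", gap)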

Journal: Math. Comput. 2006
Jared Tanner

We discuss the reconstruction of piecewise smooth data from its (pseudo-) spectral information. Spectral projections enjoy superior resolution provided the function is globally smooth, while the presence of jump discontinuities is responsible for spurious O(1) Gibbs’ oscillations in the neighborhood of edges and an overall deterioration of the convergence rate to the unacceptable first order. C...
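
The paper develops adaptive mollifiers; the sketch below only illustrates the underlying issue by applying a fixed exponential spectral filter to the Fourier coefficients of a square wave. The filter, its parameters, and the test function are assumptions, not the paper's construction.

# Sketch only: truncated Fourier series of a step function exhibits Gibbs
# oscillations; a fixed exponential filter on the coefficients restores fast
# convergence away from the jump (nothing is claimed at the jump itself).
import numpy as np

N = 64                                          # number of retained modes
x = np.linspace(-np.pi, np.pi, 1024, endpoint=False)
f = np.where(x < 0, -1.0, 1.0)                  # piecewise-constant test function

# Fourier coefficients of the square wave: c_k = 2/(i*pi*k) for odd k, else 0.
k = np.arange(-N, N + 1)
with np.errstate(divide="ignore", invalid="ignore"):
    c = np.where(k % 2 != 0, 2.0 / (1j * np.pi * k), 0.0)

def partial_sum(coeffs):
    return np.real(sum(ck * np.exp(1j * kk * x) for ck, kk in zip(coeffs, k)))

raw = partial_sum(c)                            # truncated series: Gibbs oscillations
sigma = np.exp(-16.0 * (np.abs(k) / N) ** 8)    # exponential filter of order 8 (assumed)
filtered = partial_sum(c * sigma)               # filtered series

mask = (np.abs(x) > 0.5) & (np.abs(x) < np.pi - 0.5)   # stay away from both jumps
print("max error away from jumps, raw:     ", np.max(np.abs(raw - f)[mask]))
print("max error away from jumps, filtered:", np.max(np.abs(filtered - f)[mask]))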

2007
Stephen Boyd Almir Mutapcic

Table of contents excerpt: 3 Convergence proof; 3.1 Assumptions; 3.2 Some basic inequalities; 3.3 A bound on the suboptimality bound; 3.4 A stopping criterion; 3.5 Numerical examp...
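
A minimal sketch of the standard subgradient method with the classical suboptimality bound used as a stopping test, f_best - f* <= (R^2 + G^2 * sum_i alpha_i^2) / (2 * sum_i alpha_i), where R bounds the initial distance to a minimizer and G bounds the subgradient norms; the example objective ||Ax - b||_inf, the value of R, and the step-size rule are assumptions for illustration.

# Sketch only: subgradient method for f(x) = ||Ax - b||_inf with the classical
# suboptimality bound tracked as a stopping criterion.
import numpy as np

rng = np.random.default_rng(4)
m, n = 30, 5
A = rng.normal(size=(m, n))
b = rng.normal(size=m)

def f_and_subgrad(x):
    """f(x) = max_i |a_i.x - b_i| and one subgradient of it."""
    r = A @ x - b
    i = int(np.argmax(np.abs(r)))
    return np.abs(r[i]), np.sign(r[i]) * A[i]

x = np.zeros(n)
R = 5.0                                   # assumed bound on ||x_1 - x*||
G = np.max(np.linalg.norm(A, axis=1))     # bound on subgradient norms
eps = 0.5                                 # target certified accuracy (assumed)

f_best = np.inf
sum_a, sum_a2 = 0.0, 0.0
for k in range(1, 100001):
    fx, g = f_and_subgrad(x)
    f_best = min(f_best, fx)
    alpha = R / (G * np.sqrt(k))          # a standard diminishing step size
    sum_a += alpha
    sum_a2 += alpha ** 2
    bound = (R**2 + G**2 * sum_a2) / (2.0 * sum_a)
    if bound <= eps:                      # the bound certifies f_best - f* <= eps
        break
    x = x - alpha * g

print("iterations:", k, " f_best:", f_best, " certified bound on f_best - f*:", bound)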

Journal: Math. Meth. of OR 2005
Jian-Wen Peng H. W. Joseph Lee Wei Dong Rong Xinmin Yang

Some new results are obtained which generalize the Hahn-Banach theorem from the scalar- or vector-valued case to the set-valued case. The existence of the Borwein-strong subgradient and the Yang-weak subgradient for set-valued maps is also proven. We present a new Lagrange multiplier theorem and a new sandwich theorem for set-valued maps.
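
For background only, and not the paper's set-valued statement, a common scalar form of the sandwich theorem that results of this kind generalize:

% Scalar sandwich theorem (background; the set-valued generalization is the paper's subject).
If $p\colon X\to\mathbb{R}$ is convex and $q\colon X\to\mathbb{R}$ is concave on a real
vector space $X$ with $q(x)\le p(x)$ for all $x\in X$, then there exists an affine
function $a\colon X\to\mathbb{R}$ such that
\[
  q(x) \;\le\; a(x) \;\le\; p(x) \qquad \text{for all } x \in X .
\]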

Chart of the number of search results per year