Search results for: frank and wolfe method

Number of results: 17,046,428

Journal: Comp. Opt. and Appl., 2016
Masaru Ito

We develop subgradient- and gradient-based methods for minimizing strongly convex functions under a notion which generalizes standard Euclidean strong convexity. We propose a unifying framework for subgradient methods which yields two kinds of methods, namely the Proximal Gradient Method (PGM) and the Conditional Gradient Method (CGM), unifying several existing methods. The unifying framewor...

Thesis: Ministry of Science, Research and Technology - Lorestan University - Faculty of Basic Sciences, 1389

Abstract, part one: the electrode oxidation potentials of a series of eighteen N-hydroxy compounds in aqueous solution were calculated based on a proper thermodynamic cycle. The DFT method at the B3LYP/6-31G(d,p) level was used to calculate the gas-phase free energy differences, and the polarizable continuum model (PCM) was applied to describe the solvent and its interaction with the N-hydroxy ...

Thesis: 0 1370

All analytical methods are generally based on the measurement of a parameter or parameters which are somehow related to the concentration of the species. An ideal analytical method is one in which the concentration of a species can be measured with a high degree of precision and accuracy and with high sensitivity. Unfortunately, finding such a method is very difficult or sometimes even impossible. In...

Thesis: Ministry of Science, Research and Technology - University of Tabriz - Faculty of Literature and Foreign Languages, 1393

For several years, researchers studying EFL teachers' familiarity with the post-method condition and its role in second- and foreign-language learners' production have pointed out that the opportunity to plan for a task generally supports language learners' development (Ellis, 2005). It is worth noting that a critical change in language teaching has been the disappearance of the concept of method ...

2015
François-Xavier Briol, Chris J. Oates, Mark A. Girolami, Michael A. Osborne

There is renewed interest in formulating integration as a statistical inference problem, motivated by obtaining a full distribution over numerical error that can be propagated through subsequent computation. Current methods, such as Bayesian Quadrature, demonstrate impressive empirical performance but lack theoretical analysis. An important challenge is therefore to reconcile these probabilisti...

Journal: Journal of Nonsmooth Analysis and Optimization, 2023

Point source localisation is generally modelled as a Lasso-type problem on measures. However, optimisation methods in non-Hilbert spaces, such as the space of Radon measures, are much less developed than those in Hilbert spaces. Most numerical algorithms for point source localisation are based on the Frank-Wolfe conditional gradient method, for which ad hoc convergence theory has been developed. We develop extensions of proximal-type methods to such spaces. This inc...

Journal: CoRR, 2015
Simon Lacoste-Julien, Fredrik Lindsten, Francis R. Bach

Recently, the Frank-Wolfe optimization algorithm was suggested as a procedure to obtain adaptive quadrature rules for integrals of functions in a reproducing kernel Hilbert space (RKHS) with a potentially faster rate of convergence than Monte Carlo integration (and “kernel herding” was shown to be a special case of this procedure). In this paper, we propose to replace the random sampling step i...

Journal: Math. Oper. Res., 2009
Werner Schachinger, Immanuel M. Bomze

The famous Frank–Wolfe theorem ensures attainability of the optimal value for quadratic objective functions over a (possibly unbounded) polyhedron if the feasible values are bounded. This theorem does not hold in general for conic programs where linear constraints are replaced by more general convex constraints like positive-semidefiniteness or copositivity conditions, despite the fact that the...
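For reference, the classical theorem this abstract refers to can be stated as follows:

```latex
\textbf{Theorem (Frank--Wolfe, 1956).}
Let $f(x) = \tfrac{1}{2}\, x^{\top} Q x + c^{\top} x$ be a quadratic function and let
$P = \{\, x \in \mathbb{R}^{n} : A x \le b \,\}$ be a nonempty polyhedron.
If $f$ is bounded below on $P$, then $f$ attains its minimum on $P$:
there exists $x^{*} \in P$ with $f(x^{*}) = \inf_{x \in P} f(x)$.
```

The abstract's point is that this attainability can fail once the polyhedral constraint $Ax \le b$ is replaced by more general conic constraints such as positive semidefiniteness.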

2016
Elad Hazan, Haipeng Luo

The Frank-Wolfe optimization algorithm has recently regained popularity for machine learning applications due to its projection-free property and its ability to handle structured constraints. However, in the stochastic learning setting, it is still relatively understudied compared to its gradient descent counterpart. In this work, leveraging a recent variance reduction technique, we propose two...

2015
Dan Garber, Elad Hazan

The Frank-Wolfe method (a.k.a. conditional gradient algorithm) for smooth optimization has regained much interest in recent years in the context of large-scale optimization and machine learning. A key advantage of the method is that it avoids projections, the computational bottleneck in many applications, replacing them with a linear optimization step. Despite this advantage, the known convergence ra...
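The projection-free mechanism described in this abstract can be sketched in a few lines. The helper name and toy objective below are illustrative assumptions, not taken from the paper: it minimizes a simple quadratic over the probability simplex, where the linear minimization step reduces to picking the vertex with the smallest gradient coordinate.

```python
import numpy as np

def frank_wolfe_simplex(grad, x0, n_iters=2000):
    """Minimal Frank-Wolfe (conditional gradient) loop on the probability simplex.

    Each iteration solves a *linear* problem over the feasible set instead of
    projecting: on the simplex, argmin_s <g, s> is just the vertex e_i with
    the smallest gradient coordinate.
    """
    x = x0.copy()
    for t in range(n_iters):
        g = grad(x)
        s = np.zeros_like(x)
        s[int(np.argmin(g))] = 1.0     # linear minimization oracle: a simplex vertex
        gamma = 2.0 / (t + 2.0)        # classical O(1/t) step-size schedule
        x = (1.0 - gamma) * x + gamma * s   # convex combination stays feasible
    return x

# Toy problem: minimize f(x) = ||x - c||^2 over the simplex. Since c lies in
# the simplex, the minimizer is x* = c; grad f(x) = 2(x - c).
c = np.array([0.2, 0.5, 0.3])
x = frank_wolfe_simplex(lambda x: 2.0 * (x - c), np.array([1.0, 0.0, 0.0]))
```

Because every iterate is a convex combination of simplex vertices, feasibility holds automatically and no projection is ever computed, which is exactly the advantage the abstract highlights.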
