Search results for: frank and wolfe method
Number of results: 17046428
New results on subgradient methods for strongly convex optimization problems with a unified analysis
We develop subgradient- and gradient-based methods for minimizing strongly convex functions under a notion which generalizes the standard Euclidean strong convexity. We propose a unifying framework for subgradient methods which yields two kinds of methods, namely, the Proximal Gradient Method (PGM) and the Conditional Gradient Method (CGM), unifying several existing methods. The unifying framewor...
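The abstract above is truncated, so the paper's unified framework is not available here; as a reference point, a plain projected subgradient method for a strongly convex objective can be sketched as follows. This is a generic illustration, not the paper's method; the step size c/(mu·(t+1)) is the classic choice for a mu-strongly convex function, and all names are illustrative.

```python
import numpy as np

def projected_subgradient(subgrad, project, x0, mu, n_iters=500):
    """Generic projected subgradient method for a mu-strongly convex
    objective (an illustrative sketch, not the paper's framework)."""
    x = x0.copy()
    for t in range(n_iters):
        step = 2.0 / (mu * (t + 1))      # classic O(1/t) step size
        x = project(x - step * subgrad(x))
    return x

# Example: minimize f(x) = ||x - b||^2 (mu = 2) over the unit ball.
b = np.array([2.0, 0.0])
proj_ball = lambda y: y / max(1.0, np.linalg.norm(y))
x_hat = projected_subgradient(lambda x: 2 * (x - b), proj_ball,
                              np.zeros(2), mu=2.0)
# The minimizer over the unit ball is the projection of b, i.e. [1, 0].
```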
Abstract, Part One: The electrode oxidation potentials of a series of eighteen N-hydroxy compounds in aqueous solution were calculated based on a proper thermodynamic cycle. The DFT method at the B3LYP/6-31G(d,p) level was used to calculate the gas-phase free energy differences, and the polarizable continuum model (PCM) was applied to describe the solvent and its interaction with the N-hydroxy ...
All analytical methods are generally based on the measurement of a parameter or parameters that are somehow related to the concentration of the species. An ideal analytical method is one in which the concentration of a species can be measured with a high degree of precision and accuracy and with high sensitivity. Unfortunately, finding such a method is very difficult or sometimes even impossible. In...
For several years, researchers studying the familiarity of EFL teachers with the post-method era and its role in second and foreign language learners' production have pointed out that the opportunity to plan for a task generally supports language learners' development (Ellis, 2005). It is important to mention that a critical change in language teaching has been the disappearance of the concept of method ...
There is renewed interest in formulating integration as a statistical inference problem, motivated by obtaining a full distribution over numerical error that can be propagated through subsequent computation. Current methods, such as Bayesian Quadrature, demonstrate impressive empirical performance but lack theoretical analysis. An important challenge is therefore to reconcile these probabilisti...
Point source localisation is generally modelled as a Lasso-type problem on measures. However, optimisation methods in non-Hilbert spaces, such as the space of Radon measures, are much less developed than in Hilbert spaces. Most numerical algorithms for point source localisation are based on the Frank-Wolfe conditional gradient method, for which ad hoc convergence theory has been developed. We develop extensions of proximal-type methods to such spaces. This inc...
Recently, the Frank-Wolfe optimization algorithm was suggested as a procedure to obtain adaptive quadrature rules for integrals of functions in a reproducing kernel Hilbert space (RKHS) with a potentially faster rate of convergence than Monte Carlo integration (and “kernel herding” was shown to be a special case of this procedure). In this paper, we propose to replace the random sampling step i...
The famous Frank–Wolfe theorem ensures attainability of the optimal value for quadratic objective functions over a (possibly unbounded) polyhedron if the feasible values are bounded. This theorem does not hold in general for conic programs where linear constraints are replaced by more general convex constraints like positive-semidefiniteness or copositivity conditions, despite the fact that the...
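For reference, a standard textbook formulation of the theorem named in this abstract (not quoted from the item itself) is:

```latex
% Frank--Wolfe theorem (1956), standard formulation
\textbf{Theorem (Frank--Wolfe).}
Let $f(x) = \tfrac{1}{2}\, x^{\top} Q x + c^{\top} x$ be a quadratic
function and let $P = \{\, x \in \mathbb{R}^n : A x \le b \,\}$ be a
(possibly unbounded) polyhedron. If $\inf_{x \in P} f(x) > -\infty$,
then the infimum is attained: there exists $x^{\ast} \in P$ with
$f(x^{\ast}) = \inf_{x \in P} f(x)$.
```

As the abstract notes, this attainability property can fail once the linear constraints are replaced by general convex cones such as the positive-semidefinite or copositive cone.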
The Frank-Wolfe optimization algorithm has recently regained popularity for machine learning applications due to its projection-free property and its ability to handle structured constraints. However, in the stochastic learning setting, it is still relatively understudied compared with its gradient descent counterpart. In this work, leveraging a recent variance reduction technique, we propose two...
The Frank-Wolfe method (a.k.a. conditional gradient algorithm) for smooth optimization has regained much interest in recent years in the context of large-scale optimization and machine learning. A key advantage of the method is that it avoids projections, the computational bottleneck in many applications, replacing them with a linear optimization step. Despite this advantage, the known convergence ra...
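The projection-free mechanism described above can be sketched in a few lines: each iteration calls a linear minimization oracle over the feasible set instead of a projection. The sketch below uses the probability simplex as an illustrative feasible set (the linear subproblem then reduces to picking one vertex) and the standard 2/(t+2) step size; it is a minimal illustration, not any particular paper's algorithm.

```python
import numpy as np

def frank_wolfe(grad, x0, n_iters=2000):
    """Minimal Frank-Wolfe sketch over the probability simplex."""
    x = x0.copy()
    for t in range(n_iters):
        g = grad(x)
        # Linear minimization oracle: min_{s in simplex} <g, s> is
        # attained at the vertex with the smallest gradient coordinate.
        s = np.zeros_like(x)
        s[np.argmin(g)] = 1.0
        gamma = 2.0 / (t + 2.0)          # standard step size
        x = (1 - gamma) * x + gamma * s  # convex combination stays feasible
    return x

# Example: minimize ||x - b||^2 over the simplex; b lies in the simplex,
# so the iterates should approach b itself.
b = np.array([0.1, 0.6, 0.3])
x_star = frank_wolfe(lambda x: 2 * (x - b), np.array([1.0, 0.0, 0.0]))
```

Note that every iterate is a convex combination of simplex vertices, so feasibility holds by construction and no projection is ever needed.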