Search results for: projected structured hessian update

Number of results: 231,982

2012
Sebastian U. Stich, Christian L. Müller

We consider Covariance Matrix Adaptation schemes (CMA-ES [3], Gaussian Adaptation (GaA) [4]) and Randomized Hessian (RH) schemes from Leventhal and Lewis [5]. We provide a new, numerically stable implementation for RH and, in addition, combine the update with an adaptive step size strategy. We design a class of quadratic functions with parametrizable spectra to study the influence of the spectr...
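The abstract mentions a class of quadratic test functions with parametrizable spectra. A minimal sketch of one such construction (an assumption for illustration, not the paper's exact benchmark class): fix the eigenvalues, then rotate by a random orthogonal matrix so the problem is not axis-aligned.

```python
import numpy as np

def quadratic_with_spectrum(eigenvalues, seed=0):
    """Build f(x) = 0.5 * x^T A x where A has the prescribed eigenvalues,
    rotated by a random orthogonal matrix (hypothetical construction)."""
    rng = np.random.default_rng(seed)
    n = len(eigenvalues)
    # Random orthogonal matrix via QR decomposition of a Gaussian matrix.
    Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
    A = Q @ np.diag(eigenvalues) @ Q.T
    return lambda x: 0.5 * x @ A @ x, A

# Condition number 100: spectrum log-uniform between 1 and 100.
eigs = np.logspace(0, 2, 10)
f, A = quadratic_with_spectrum(eigs)
```

Varying the eigenvalue distribution (linear, log-uniform, clustered) is what lets such a class probe how a Hessian-learning scheme reacts to the spectrum.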

Journal: SIAM Journal on Matrix Analysis and Applications, 2014

Journal: Journal of the Medical Library Association (JMLA), 2014

2012
Chris Hinrichs, Vikas Singh, Jiming Peng, Sterling C. Johnson

Notice that in general, while the number of kernels may be large, it is unlikely that the size of Q (quadratic in the number of kernels, not the number of examples) will dominate the combined size of all of the kernels. If so (as in a majority of computer vision problems), it may be advantageous to employ second-order methods to solve (1) for β in terms of w. Perhaps the best known of these second order met...

2014
Asrul Hery bin Ibrahim, Mustafa Mamat

In this paper we present a new line search method, the HBFGS method, which combines the search direction of the conjugate gradient method with quasi-Newton updates. The Broyden-Fletcher-Goldfarb-Shanno (BFGS) update is used as an approximation of the Hessian. The new algorithm is compared with the BFGS method in terms of iteration counts and CPU time. Our numerical analysis...
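For orientation, a sketch of the standard BFGS update of the Hessian approximation that the abstract refers to (the generic textbook form; the paper's HBFGS combination with conjugate gradient directions is not reproduced here):

```python
import numpy as np

def bfgs_update(B, s, y):
    """Standard BFGS update of the Hessian approximation B.

    s = x_{k+1} - x_k (step), y = grad_{k+1} - grad_k (gradient change).
    Preserves symmetry and positive definiteness provided the curvature
    condition s^T y > 0 holds, and enforces the secant equation B_new s = y.
    """
    Bs = B @ s
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)

# On a quadratic f(x) = 0.5 x^T A x we have y = A s exactly, so repeated
# updates pull B toward A along the sampled directions.
A = np.diag([1.0, 4.0, 9.0])
B = np.eye(3)
rng = np.random.default_rng(1)
for _ in range(20):
    s = rng.standard_normal(3)
    B = bfgs_update(B, s, A @ s)
```

After each update the secant equation holds exactly for the most recent pair, which is the property quasi-Newton line search methods rely on.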

2017
Tolga Ensari

In this paper, we analyze the character recognition performance of three different nonnegative matrix factorization (NMF) algorithms: the multiplicative update (MU) rule, known as standard NMF; alternating least squares (NMF-ALS); and projected gradient descent (NMF-PGD). These are among the most widely used approaches in the literature. NMF has many application areas, such as robotics, bioinform...
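The first of the three algorithms, the Lee-Seung multiplicative update rule for the Frobenius-norm objective, can be sketched in a few lines (a minimal illustration, not the paper's experimental code; the small `eps` guard against division by zero is a common implementation choice):

```python
import numpy as np

def nmf_mu(V, rank, iters=500, seed=0, eps=1e-10):
    """Lee-Seung multiplicative updates for V ≈ W @ H, Frobenius loss.

    Both factors stay elementwise nonnegative because each update
    multiplies by a nonnegative ratio.
    """
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, rank)) + eps
    H = rng.random((rank, n)) + eps
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Exactly low-rank nonnegative data: MU should drive the residual down.
rng = np.random.default_rng(3)
V = rng.random((20, 5)) @ rng.random((5, 15))
W, H = nmf_mu(V, rank=5)
```

NMF-PGD differs in that it takes gradient steps and then projects negative entries to zero, rather than rescaling multiplicatively.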

Journal: CoRR, 2016
Farbod Roosta-Khorasani, Michael W. Mahoney

Large-scale optimization problems are ubiquitous in machine learning and data analysis, and there is a plethora of algorithms for solving them. Many of these algorithms employ sub-sampling, either to speed up the computations or to implicitly implement a form of statistical regularization. In this paper, we consider second-order iterative optimization algorithms, i.e., thos...
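A minimal sketch of the sub-sampling idea for a second-order method on a finite-sum objective (a toy illustration under my own assumptions, not the paper's algorithm): keep the gradient exact but estimate the Hessian from a random subset of the n component functions.

```python
import numpy as np

def subsampled_newton(grad_full, hess_i, x0, n, sample_size, steps=20, seed=0):
    """Newton-type iteration where the Hessian is averaged over a random
    subsample of the n components (gradient kept exact in this sketch)."""
    rng = np.random.default_rng(seed)
    x = x0.astype(float)
    for _ in range(steps):
        idx = rng.choice(n, size=sample_size, replace=False)
        H = sum(hess_i(i, x) for i in idx) / sample_size
        x = x - np.linalg.solve(H, grad_full(x))
    return x

# Toy finite sum: f_i(x) = 0.5 (x - c_i)^T A_i (x - c_i); the minimizer
# solves (mean A_i) x = mean A_i c_i.
rng = np.random.default_rng(0)
n, d = 50, 4
A = [np.diag(rng.uniform(1, 5, d)) for _ in range(n)]
c = [rng.standard_normal(d) for _ in range(n)]
A_bar = sum(A) / n
b_bar = sum(Ai @ ci for Ai, ci in zip(A, c)) / n
grad_full = lambda x: A_bar @ x - b_bar
hess_i = lambda i, x: A[i]
x_star = np.linalg.solve(A_bar, b_bar)
x = subsampled_newton(grad_full, hess_i, np.zeros(d), n, sample_size=10)
```

As long as the sampled Hessian stays close to the full one, each step contracts the error, so the iterates converge at a near-Newton rate while only touching a fifth of the components per step.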

Journal: J. Computational Applied Mathematics, 2011
Farzin Modarres, Malik Abu Hassan, Wah June Leong

Symmetric rank-one (SR1) is one of the competitive formulas among the quasi-Newton (QN) methods. In this paper, we propose some modified SR1 updates based on modified secant equations, which use both gradient and function-value information. Furthermore, to avoid loss of positive definiteness and zero denominators in the new SR1 updates, we apply a restart procedure. Three new a...
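For reference, the standard SR1 update with the usual skip safeguard against the near-zero denominators the abstract mentions (a textbook sketch; the paper's restart procedure is a different, more involved remedy):

```python
import numpy as np

def sr1_update(B, s, y, r=1e-8):
    """Symmetric rank-one update of the Hessian approximation B.

    The update is skipped when the denominator (y - Bs)^T s is tiny
    relative to |s||y - Bs|, since the formula is then numerically
    unstable; unlike BFGS, SR1 does not guarantee positive definiteness.
    """
    v = y - B @ s
    denom = v @ s
    if abs(denom) < r * np.linalg.norm(s) * np.linalg.norm(v):
        return B  # skip: denominator too close to zero
    return B + np.outer(v, v) / denom

A = np.diag([2.0, 5.0])
B = np.eye(2)
s = np.array([1.0, 1.0])
B = sr1_update(B, s, A @ s)
```

When the update is applied, the new B satisfies the secant equation B s = y exactly, which is the property the modified secant equations in the paper generalize.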

2010
Ladislav Lukšan, Jan Vlček

In this report we propose a new recursive matrix formulation of limited-memory variable metric methods. This approach makes it possible to approximate both the Hessian matrix and its inverse, and it can be used with an arbitrary update from the Broyden class (and some other updates). The new recursive formulation requires approximately 4mn multiplications and additions for the direction determination, so i...
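The report's own recursive formulation is not reproduced here; for context, the classic limited-memory recursion it competes with is the L-BFGS two-loop scheme, which applies the inverse-Hessian approximation to a vector in O(mn) operations from the m stored pairs:

```python
import numpy as np

def two_loop(grad, s_list, y_list):
    """Classic L-BFGS two-loop recursion: returns H_k @ grad built from
    the stored (s_i, y_i) pairs in O(m n) multiplications."""
    q = grad.copy()
    alphas = []
    rhos = [1.0 / (y @ s) for s, y in zip(s_list, y_list)]
    # First loop: newest pair to oldest.
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        a = rho * (s @ q)
        alphas.append(a)
        q -= a * y
    # Initial scaling gamma_k = s^T y / y^T y from the most recent pair.
    s, y = s_list[-1], y_list[-1]
    q *= (s @ y) / (y @ y)
    # Second loop: oldest pair to newest.
    for (s, y, rho), a in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        b = rho * (y @ q)
        q += (a - b) * s
    return q

# One-pair sanity check against the explicit BFGS inverse formula.
rng = np.random.default_rng(0)
s, y, g = (rng.standard_normal(4) for _ in range(3))
if y @ s < 0:
    y = -y  # enforce the curvature condition y^T s > 0
Hg = two_loop(g, [s], [y])
```

Counting multiplications, each loop costs about 2mn, for roughly 4mn total, which matches the per-direction cost the abstract quotes for the new recursive formulation.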
