Search results for: newton quasi
Number of results: 102092
This paper develops and analyzes a generalization of the Broyden class of quasi-Newton methods to the problem of minimizing a smooth objective function f on a Riemannian manifold. A condition on vector transport and retraction that guarantees convergence and facilitates efficient computation is derived. Experimental evidence is presented demonstrating the value of the extension to the Riemannian...
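To make the two manifold ingredients named in this abstract concrete, the following sketch specializes them to the unit sphere. The normalization retraction and the projection-based vector transport are common textbook choices and are assumptions here, not necessarily the pair whose compatibility condition the paper derives.

```python
import numpy as np

# Retraction and vector transport specialized to the unit sphere S^{n-1}.
# These are standard illustrative choices, not the paper's specific construction.

def retract(x, v):
    """Retraction R_x(v): step from x along the tangent vector v, then renormalize."""
    y = x + v
    return y / np.linalg.norm(y)

def project_tangent(x, u):
    """Orthogonal projection of an ambient vector u onto the tangent space at x."""
    return u - np.dot(x, u) * x

def transport(x, y, xi):
    """Vector transport: carry a tangent vector xi at x into the tangent space at y
    by orthogonal projection."""
    return project_tangent(y, xi)

# One Riemannian gradient step for f(x) = x^T A x on the sphere, where
# grad f(x) = P_x(2 A x); the transported old gradient is what a Riemannian
# quasi-Newton method would pair with the new gradient to form (s_k, y_k).
A = np.diag([3.0, 2.0, 1.0])
x = np.random.randn(3)
x /= np.linalg.norm(x)
g = project_tangent(x, 2 * A @ x)
x_new = retract(x, -0.1 * g)
g_transported = transport(x, x_new, g)
```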
This paper is dedicated to Claude Lemaréchal on the occasion of his 65th birthday. We take this opportunity to thank him deeply for the great moments we have had discussing with him (not only about math). His vision and his ability to put ideas into words have helped us deepen our understanding of optimization. This work builds on one of his lines of research: using convex analysis and nonlinear...
In this paper, we present a quasi-Newton (QN) algorithm for joint independent subspace analysis (JISA). JISA is a recently proposed generalization of independent vector analysis (IVA). JISA extends classical blind source separation (BSS) to jointly resolve several BSS problems by exploiting statistical dependence between latent sources across mixtures, as well as relaxing the assumption of stat...
We introduce a general criterion for blindly extracting a subset of sources in instantaneous mixtures. We derive the corresponding estimation equations and generalize them based on arbitrary nonlinear separating functions. A quasi-Newton algorithm for minimizing the criterion is presented, which reduces to the FastICA algorithm in the case when only one source is extracted. The asymptotic distr...
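As a point of reference for the special case mentioned above, the classical one-unit FastICA fixed-point update (with whitened data and a tanh nonlinearity, both assumptions here; the paper's criterion and separating functions are more general) might be sketched as follows.

```python
import numpy as np

# One-unit FastICA fixed-point iteration: w+ = E[x g(w^T x)] - E[g'(w^T x)] w,
# followed by renormalization. X is assumed to be whitened.

def fastica_one_unit(X, n_iter=100, tol=1e-8):
    """X: whitened data of shape (n_features, n_samples); returns one unmixing vector."""
    n, m = X.shape
    w = np.random.randn(n)
    w /= np.linalg.norm(w)
    for _ in range(n_iter):
        wx = w @ X
        g, g_prime = np.tanh(wx), 1.0 - np.tanh(wx) ** 2
        w_new = (X * g).mean(axis=1) - g_prime.mean() * w
        w_new /= np.linalg.norm(w_new)
        if abs(abs(w_new @ w) - 1.0) < tol:   # convergence up to sign
            return w_new
        w = w_new
    return w

# Toy usage: mix two non-Gaussian sources, whiten, extract one unmixing vector.
S = np.vstack([np.sign(np.random.randn(2000)), np.random.laplace(size=2000)])
X = np.array([[1.0, 0.5], [0.3, 1.0]]) @ S
X -= X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(X))
X_white = E @ np.diag(d ** -0.5) @ E.T @ X
w = fastica_one_unit(X_white)
```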
In this paper we study new preconditioners to be used within the Nonlinear Conjugate Gradient (NCG) method for large-scale unconstrained optimization. The rationale behind our proposal draws inspiration from quasi-Newton updates, and its aim is to approximate, in some sense, the inverse of the Hessian matrix. In particular, at the current iteration of the NCG we consider some precondit...
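The general flavour of a quasi-Newton-inspired preconditioner inside NCG can be sketched with a memoryless BFGS matrix built from the most recent pair (s, y) and a preconditioned Polak-Ribière direction; this is only an illustrative variant, not the preconditioners proposed in the paper.

```python
import numpy as np

# Memoryless BFGS approximation of the inverse Hessian, built from one pair (s, y)
# with y @ s > 0 (e.g. guaranteed by a Wolfe line search), used as a preconditioner
# in a Polak-Ribiere-type NCG direction. Illustrative only.

def memoryless_bfgs_apply(g, s, y):
    """Apply H = (I - rho s y^T)(I - rho y s^T) + rho s s^T to g, with H0 = I."""
    rho = 1.0 / (y @ s)
    q = g - rho * (s @ g) * y
    Hq = q - rho * (y @ q) * s
    return Hq + rho * (s @ g) * s

def pncg_direction(g_new, g_old, d_old, s, y):
    """Preconditioned PR+ direction: d = -H g_new + beta * d_old."""
    Hg_new = memoryless_bfgs_apply(g_new, s, y)
    Hg_old = memoryless_bfgs_apply(g_old, s, y)
    beta = max(0.0, (g_new @ (Hg_new - Hg_old)) / (g_old @ Hg_old))
    return -Hg_new + beta * d_old
```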
The BFGS method is one of the most famous quasi-Newton algorithms for unconstrained optimization. In 1984, Powell presented an example of a function of two variables that shows that the Polak–Ribière–Polyak (PRP) conjugate gradient method and the BFGS quasi-Newton method may cycle around eight nonstationary points if each line search picks a local minimum that provides a reduction in the object...
In this paper we present a modified BFGS algorithm for unconstrained optimization. The BFGS algorithm updates an approximate Hessian which satisfies the most recent quasi-Newton equation. The quasi-Newton condition can be interpreted as the interpolation condition that the gradient value of the local quadratic model matches that of the objective function at the previous iterate. Our modified al...
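For reference, the standard quasi-Newton (secant) condition and the BFGS update it determines read as below, with s_k = x_{k+1} - x_k and y_k the gradient difference; the paper's modified algorithm alters this interpolation condition in a way the truncated abstract does not fully specify.

```latex
% Secant condition and BFGS update, with
% s_k = x_{k+1} - x_k and y_k = \nabla f(x_{k+1}) - \nabla f(x_k).
\[
  B_{k+1} s_k = y_k,
  \qquad
  B_{k+1} = B_k
            - \frac{B_k s_k s_k^{\top} B_k}{s_k^{\top} B_k s_k}
            + \frac{y_k y_k^{\top}}{y_k^{\top} s_k}.
\]
```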
We show how the quasi-Newton least squares method (QN-LS) relates to Krylov subspace methods in general and to GMRes in particular.
We study Newton’s method and a method based on linearization for solving quasi-variational inequalities in a finite-dimensional real vector space. Projection methods have been the most studied methods for solving quasi-variational inequalities, and they have linear rates of convergence. In this paper we establish sufficient conditions for the convergence of Newton’s method and the method of linearization...
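For contrast with the Newton-type schemes studied in the paper, the baseline projection (fixed-point) iteration for a quasi-variational inequality, find x in K(x) with <F(x), y - x> >= 0 for all y in K(x), can be sketched as follows; the mapping F and the moving box K(x) below are purely illustrative assumptions.

```python
import numpy as np

# Projection method for a QVI: x_{k+1} = P_{K(x_k)}(x_k - step * F(x_k)).
# This is the linearly convergent baseline the abstract refers to.

def project_box(z, lower, upper):
    """Euclidean projection onto the box [lower, upper] (componentwise clipping)."""
    return np.clip(z, lower, upper)

def qvi_projection_method(F, K, x0, step=0.1, n_iter=500, tol=1e-10):
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        lower, upper = K(x)               # constraint set depends on the current iterate
        x_new = project_box(x - step * F(x), lower, upper)
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Hypothetical example: strongly monotone affine F, box whose bounds shift with x.
F = lambda x: 2.0 * x - 1.0
K = lambda x: (-1.0 + 0.1 * x, 1.0 + 0.1 * x)
x_star = qvi_projection_method(F, K, x0=np.zeros(2))   # converges to 0.5 in each component
```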