Search results for: conjugate gradient

Number of results: 163423

2013
Mario Arioli

We combine linear algebra techniques with finite element techniques to obtain a reliable stopping criterion for Krylov-method-based algorithms. The Conjugate Gradient method has for a long time been successfully used in the solution of the symmetric and positive definite systems obtained from the finite-element approximation of self-adjoint elliptic partial differential equations...
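
As an illustrative aside, here is a minimal NumPy sketch of the Conjugate Gradient iteration for a symmetric positive definite system, with a plain relative-residual stopping rule; the paper's energy-norm stopping criterion is not reproduced, and the test matrix and tolerance are illustrative assumptions.

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-8, max_iter=None):
    """Solve A x = b for symmetric positive definite A.

    Stops on a relative-residual test; this is the textbook
    criterion, not the energy-norm criterion the paper derives.
    """
    n = b.size
    max_iter = max_iter or n
    x = np.zeros(n)
    r = b - A @ x          # residual
    d = r.copy()           # first search direction
    rs = r @ r
    for _ in range(max_iter):
        Ad = A @ d
        alpha = rs / (d @ Ad)          # exact line search along d
        x += alpha * d
        r -= alpha * Ad
        rs_new = r @ r
        if np.sqrt(rs_new) <= tol * np.linalg.norm(b):
            break
        d = r + (rs_new / rs) * d      # A-conjugate update of direction
        rs = rs_new
    return x

# Illustrative SPD test system (not from the paper)
rng = np.random.default_rng(0)
M = rng.standard_normal((50, 50))
A = M @ M.T + 50 * np.eye(50)
b = rng.standard_normal(50)
x = conjugate_gradient(A, b)
print(np.linalg.norm(A @ x - b))
```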

2009
P. Thiruvalar Selvan S. Raghavan

In recent years, Computer Aided Design (CAD) based on Artificial Neural Networks (ANNs) has been introduced for microwave modeling, simulation, and optimization. In this paper, the characteristic parameters of Broadside Coupled Coplanar Waveguides (BSCCPWs) have been determined using an ANN model. Eight learning algorithms, Levenberg-Marquardt (LM), Bayesian Regularization (BR), Quasi-Newton ...

Journal: CoRR 2017
Sen Na Mingyuan Ma Shuming Ma Guangju Peng

In this paper, we propose two new Stochastic Strictly Contractive Peaceman-Rachford Splitting Methods (SCPRSM), called Stochastic SCPRSM (SS-PRSM) and Stochastic Conjugate Gradient SCPRSM (SCG-PRSM), for large-scale optimization problems. The two types of stochastic PRSM algorithms incorporate the stochastic variance reduced gradient (SVRG) and the conjugate gradient method, respectively. Stochasti...
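
To make the variance-reduction ingredient concrete, below is a minimal sketch of the SVRG gradient estimator mentioned in the abstract, applied to a plain least-squares objective; the objective, step size, and data are illustrative assumptions, and the Peaceman-Rachford splitting machinery itself is omitted.

```python
import numpy as np

def svrg_least_squares(A, b, eta=0.01, epochs=20, inner=None):
    """SVRG on f(x) = (1/2n) * ||A x - b||^2.

    Each inner step uses the variance-reduced estimator
    g = grad f_i(x) - grad f_i(x_snap) + full_grad(x_snap).
    """
    n, d = A.shape
    inner = inner or n
    rng = np.random.default_rng(0)
    x = np.zeros(d)
    for _ in range(epochs):
        x_snap = x.copy()
        full_grad = A.T @ (A @ x_snap - b) / n    # gradient at snapshot
        for _ in range(inner):
            i = rng.integers(n)
            a_i = A[i]
            g_i = a_i * (a_i @ x - b[i])          # stochastic grad at x
            g_snap = a_i * (a_i @ x_snap - b[i])  # same component at snapshot
            x -= eta * (g_i - g_snap + full_grad)
    return x

rng = np.random.default_rng(1)
A = rng.standard_normal((200, 10))
b = rng.standard_normal(200)
x = svrg_least_squares(A, b)
print(np.linalg.norm(A.T @ (A @ x - b)) / 200)  # gradient norm, should be small
```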

2016
Sahar Karimi Stephen Vavasis

Nesterov’s accelerated gradient method for minimizing a smooth strongly convex function f is known to reduce f(x_k) − f(x*) by a factor of ε ∈ (0, 1) after k ≥ O(√(L/ℓ) log(1/ε)) iterations, where ℓ, L are the two parameters of smooth strong convexity. Furthermore, it is known that this is the best possible complexity in the function-gradient oracle model of computation. The method of linear conju...
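
For reference, here is a small sketch of Nesterov's accelerated method in the strongly convex setting the abstract refers to, using the constant momentum coefficient (√L − √μ)/(√L + √μ), where μ plays the role of ℓ above; the quadratic test problem is an illustrative assumption.

```python
import numpy as np

def nesterov_strongly_convex(grad, x0, L, mu, iters=500):
    """Accelerated gradient for an L-smooth, mu-strongly convex f.

    Uses the constant momentum (sqrt(L) - sqrt(mu)) / (sqrt(L) + sqrt(mu))
    appropriate when both parameters are known.
    """
    beta = (np.sqrt(L) - np.sqrt(mu)) / (np.sqrt(L) + np.sqrt(mu))
    x = x0.copy()
    y = x0.copy()
    for _ in range(iters):
        x_next = y - grad(y) / L          # gradient step from extrapolated point
        y = x_next + beta * (x_next - x)  # momentum extrapolation
        x = x_next
    return x

# Illustrative quadratic f(x) = 0.5 x^T Q x, whose minimizer is 0
rng = np.random.default_rng(2)
M = rng.standard_normal((30, 30))
Q = M @ M.T + np.eye(30)
L = np.linalg.eigvalsh(Q).max()
mu = np.linalg.eigvalsh(Q).min()
x = nesterov_strongly_convex(lambda z: Q @ z, rng.standard_normal(30), L, mu)
print(np.linalg.norm(x))  # distance to the minimizer
```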

Journal: SIAM Journal on Scientific Computing 2021

On the Convergence Rate of Variants of the Conjugate Gradient Algorithm in Finite Precision Arithmetic

Journal: SIAM Journal on Optimization 2015
Philipp Hennig

This paper proposes a probabilistic framework for algorithms that iteratively solve unconstrained linear problems Bx = b with positive definite B for x. The goal is to replace the point estimates returned by existing methods with a Gaussian posterior belief over the elements of the inverse of B, which can be used to estimate errors. Recent probabilistic interpretations of the secant family of q...
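
One ingredient behind such probabilistic solvers can be illustrated directly: a full set of B-conjugate search directions d_i, such as those produced by CG, reconstructs the inverse via B⁻¹ = Σ_i d_i d_iᵀ / (d_iᵀ B d_i), so truncating the sum gives a low-rank estimate of B⁻¹. The sketch below checks this identity only; it is not the paper's Gaussian posterior, and the test data are illustrative assumptions.

```python
import numpy as np

def cg_directions(B, b):
    """Run full CG on B x = b and collect the search directions d_i."""
    n = b.size
    x = np.zeros(n)
    r = b - B @ x
    d = r.copy()
    dirs = []
    rs = r @ r
    for _ in range(n):
        dirs.append(d.copy())
        Bd = B @ d
        alpha = rs / (d @ Bd)
        x += alpha * d
        r -= alpha * Bd
        rs_new = r @ r
        if rs_new < 1e-28:
            break
        d = r + (rs_new / rs) * d
        rs = rs_new
    return dirs

rng = np.random.default_rng(3)
M = rng.standard_normal((8, 8))
B = M @ M.T + 8 * np.eye(8)
b = rng.standard_normal(8)

# Sum of rank-one terms d d^T / (d^T B d) over conjugate directions
# reconstructs B^{-1}; truncating the sum gives a low-rank estimate.
inv_est = sum(np.outer(d, d) / (d @ B @ d) for d in cg_directions(B, b))
print(np.linalg.norm(inv_est - np.linalg.inv(B)))  # near machine precision
```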

1996
Sergey Fomel

This tutorial describes the classic method of conjugate directions: the generalization of the conjugate-gradient method in iterative least-square inversion. I derive the algebraic equations of the conjugate-direction method from general optimization principles. The derivation explains the “magic” properties of conjugate gradients. It also justifies the use of conjugate directions in cases when ...
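
As a companion to the tutorial's subject, here is a minimal sketch of the conjugate-direction idea: arbitrary candidate directions are A-orthogonalized against all previous ones (a Gram-Schmidt step) before an exact line search, which yields the finite-termination property the tutorial explains; the problem data and choice of candidate directions are illustrative assumptions.

```python
import numpy as np

def conjugate_directions(A, b, candidates):
    """Method of conjugate directions for A x = b, SPD A.

    Each raw candidate direction is A-conjugated against all
    previous directions, then used for an exact line search
    on the quadratic 0.5 x^T A x - b^T x.
    """
    x = np.zeros_like(b)
    r = b - A @ x
    kept = []   # pairs (d, A d) of accepted conjugate directions
    for c in candidates:
        d = c.copy()
        for dj, Adj in kept:
            d -= (c @ Adj) / (dj @ Adj) * dj   # remove A-components of c
        Ad = A @ d
        alpha = (r @ d) / (d @ Ad)             # exact line search
        x += alpha * d
        r -= alpha * Ad
        kept.append((d, Ad))
    return x

rng = np.random.default_rng(4)
M = rng.standard_normal((12, 12))
A = M @ M.T + 12 * np.eye(12)
b = rng.standard_normal(12)
# Coordinate axes as raw directions: n steps reach the exact solution.
x = conjugate_directions(A, b, list(np.eye(12)))
print(np.linalg.norm(A @ x - b))
```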

1998
Paolo Campolucci Michele Simonetti Aurelio Uncini Francesco Piazza

In this paper we derive two second-order algorithms, based on conjugate gradient, for on-line training of recurrent neural networks. These algorithms use two different techniques to extract second-order information on the Hessian matrix without calculating or storing it and without making numerical approximations. Several simulation results for non-linear system identification tests by locally ...
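
Using curvature information without forming the Hessian is commonly realized through Hessian-vector products consumed by CG; below is a hedged sketch of such a solver that only calls an hvp routine, here supplied exactly for a quadratic test function rather than by an R-operator on a recurrent network as in the paper.

```python
import numpy as np

def cg_with_hvp(hvp, g, iters=50, tol=1e-10):
    """Solve H p = g using only Hessian-vector products hvp(v) = H v.

    The Hessian H is never computed or stored; only its action
    on vectors is required.
    """
    p = np.zeros_like(g)
    r = g - hvp(p)
    d = r.copy()
    rs = r @ r
    for _ in range(iters):
        Hd = hvp(d)
        alpha = rs / (d @ Hd)
        p += alpha * d
        r -= alpha * Hd
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        d = r + (rs_new / rs) * d
        rs = rs_new
    return p

# Illustrative quadratic: the hvp is exact, with no numerical approximation.
rng = np.random.default_rng(5)
M = rng.standard_normal((20, 20))
H = M @ M.T + 20 * np.eye(20)
g = rng.standard_normal(20)
p = cg_with_hvp(lambda v: H @ v, g)
print(np.linalg.norm(H @ p - g))
```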

2013
Xiaomin Duan Huafei Sun Z. N. Zhang

A new framework based on a curved Riemannian manifold is proposed to calculate the numerical solution of the Lyapunov matrix equation using a natural gradient descent algorithm, taking the geodesic distance as the objective function. Moreover, a gradient descent algorithm based on the classical Euclidean distance is provided for comparison with the natural gradient descent algorithm. Furth...
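
The Euclidean baseline the abstract compares against can be sketched directly: minimize the Frobenius-norm residual of a Lyapunov equation A X + X Aᵀ + Q = 0 by plain gradient descent. The equation form, step size, and test data below are illustrative assumptions; the natural-gradient/geodesic variant is not reproduced.

```python
import numpy as np

def lyapunov_gd(A, Q, lr=None, iters=5000, tol=1e-10):
    """Euclidean gradient descent on f(X) = ||A X + X A^T + Q||_F^2.

    The gradient of f is 2 (A^T R + R A) with R = A X + X A^T + Q.
    """
    n = A.shape[0]
    lr = lr or 1.0 / (8 * np.linalg.norm(A, 2) ** 2)  # conservative step size
    X = np.zeros((n, n))
    for _ in range(iters):
        R = A @ X + X @ A.T + Q        # Lyapunov residual
        if np.linalg.norm(R) < tol:
            break
        X -= lr * 2 * (A.T @ R + R @ A)
    return X

# Illustrative stable A (so the equation has a unique solution) and SPD Q.
rng = np.random.default_rng(6)
A = -np.eye(4) + 0.3 * rng.standard_normal((4, 4))
Q = np.eye(4)
X = lyapunov_gd(A, Q)
print(np.linalg.norm(A @ X + X @ A.T + Q))  # residual of the solution
```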

[Chart: number of search results per year]