A new Levenberg-Marquardt approach based on Conjugate gradient structure for solving absolute value equations
Authors
Abstract:
In this paper, we present a new approach for solving the absolute value equation (AVE) which uses the Levenberg-Marquardt method with a conjugate subgradient structure. In conjugate subgradient methods, the new direction is obtained by combining the steepest descent direction with the previous direction, which may not lead to good numerical results. Therefore, we replace the steepest descent direction with the Levenberg-Marquardt direction. The descent property of the direction generated by the new algorithm at each iteration is established. The global convergence of the method is also established under some mild assumptions. Some numerical results are reported.
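To make the idea above concrete, the following is a minimal sketch of one way such a method could look: the residual of the AVE Ax - |x| = b is driven to zero by a Levenberg-Marquardt direction that is then combined with the previous direction in a conjugate-style update. The damping parameter `mu`, the combination weight `beta`, the generalized Jacobian choice, and the backtracking line search are all illustrative assumptions, not the exact scheme or parameter schedule of the paper.

```python
import numpy as np

def lm_cg_ave(A, b, x0=None, mu=1e-3, beta=0.3, tol=1e-8, max_iter=500):
    """Illustrative LM + conjugate-structure iteration for Ax - |x| = b.

    All parameter choices here (mu, beta, the backtracking constants)
    are assumptions for the sketch, not the paper's algorithm.
    """
    n = A.shape[0]
    x = np.zeros(n) if x0 is None else np.asarray(x0, dtype=float).copy()
    d_prev = np.zeros(n)
    for k in range(max_iter):
        F = A @ x - np.abs(x) - b                  # AVE residual
        if np.linalg.norm(F) < tol:
            break
        J = A - np.diag(np.sign(x))                # generalized Jacobian (a subgradient of |x|)
        g = J.T @ F                                # gradient of the merit function 0.5*||F||^2
        # Levenberg-Marquardt direction: solve (J^T J + mu I) d = -g
        d_lm = np.linalg.solve(J.T @ J + mu * np.eye(n), -g)
        # conjugate structure: combine with the previous direction,
        # falling back to the pure LM step if the descent property is lost
        d = d_lm + beta * d_prev
        if d @ g >= 0:
            d = d_lm
        # simple backtracking on the merit function 0.5*||F||^2
        t, merit = 1.0, 0.5 * (F @ F)
        while t > 1e-12:
            Fn = A @ (x + t * d) - np.abs(x + t * d) - b
            if 0.5 * (Fn @ Fn) < merit:
                break
            t *= 0.5
        x = x + t * d
        d_prev = d
    return x
```

The fallback to the pure LM direction mirrors the abstract's emphasis on the descent property of the generated direction: whenever the combined direction is not a descent direction for the merit function, the LM direction (which always is, since J^T J + mu I is positive definite) is used instead.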
Similar references
A Three-terms Conjugate Gradient Algorithm for Solving Large-Scale Systems of Nonlinear Equations
The nonlinear conjugate gradient method is well known for solving large-scale unconstrained optimization problems due to its low storage requirement and simple implementation. Research activities on its application to higher-dimensional systems of nonlinear equations are just beginning. This paper presents a three-term conjugate gradient algorithm for solving large-scale systems of nonlinear e...
A New Cuckoo Search Based Levenberg-Marquardt (CSLM) Algorithm
The back-propagation neural network (BPNN) algorithm is a widely used technique in training artificial neural networks. It is also a very popular optimization procedure applied to find optimal weights in a training process. However, traditional back propagation optimized with the Levenberg-Marquardt training algorithm has some drawbacks, such as getting stuck in local minima and network stagnancy. This...
On Levenberg-Marquardt-Kaczmarz Iterative Methods for Solving Systems of Nonlinear Ill-posed Equations
In this article, a modified Levenberg-Marquardt method coupled with a Kaczmarz strategy for obtaining stable solutions of nonlinear systems of ill-posed operator equations is investigated. We show that the proposed method is a convergent regularization method. Numerical tests are presented for a nonlinear inverse doping problem based on a bipolar model.
A Parameter-Self-Adjusting Levenberg-Marquardt Method for Solving Nonsmooth Equations
A parameter-self-adjusting Levenberg-Marquardt method (PSA-LMM) is proposed for solving a nonlinear system of equations F(x) = 0, where F : R^n → R^n is a semismooth mapping. At each iteration, the LM parameter μk is automatically adjusted based on the ratio between actual reduction and predicted reduction. The global convergence of PSA-LMM for solving semismooth equations is demonstrated. Under th...
A New Hybrid Conjugate Gradient Method Based on Eigenvalue Analysis for Unconstrained Optimization Problems
In this paper, two extended three-term conjugate gradient methods based on the Liu-Storey (LS) conjugate gradient method are presented to solve unconstrained optimization problems. A remarkable property of the proposed methods is that the search direction always satisfies the sufficient descent condition independently of the line search method, based on eigenvalue analysis. The globa...
Two CSCS-Based Iteration Methods for Solving Absolute Value Equations
Recently, two families of HSS-based iteration methods have been constructed for solving the system of absolute value equations (AVEs), which is a class of non-differentiable NP-hard problems. In this study, we establish the Picard-CSCS iteration method and the nonlinear CSCS-like iteration method for AVEs involving the Toeplitz matrix. Then, we analyze the convergence of the Picard-CSCS iteration met...
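For context, the fixed-point structure underlying Picard-type iterations for the AVE Ax - |x| = b can be sketched as follows. This is the classical Picard iteration x_{k+1} = A^{-1}(|x_k| + b), shown here with an exact solve for clarity; the CSCS-based methods above replace that solve with a Toeplitz-friendly circulant/skew-circulant splitting, which this sketch does not implement.

```python
import numpy as np

def picard_ave(A, b, x0=None, tol=1e-10, max_iter=1000):
    """Classical Picard iteration x_{k+1} = A^{-1}(|x_k| + b) for the
    AVE Ax - |x| = b; converges when ||A^{-1}|| < 1 (the iteration map
    is then a contraction). Exact solve used for illustration only.
    """
    n = A.shape[0]
    x = np.zeros(n) if x0 is None else np.asarray(x0, dtype=float).copy()
    for _ in range(max_iter):
        x_new = np.linalg.solve(A, np.abs(x) + b)  # one Picard step
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x
```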
Journal title
Volume 5, Issue 21
Pages 5-14
Publication date: 2019-12-22
Hosted on the doprax.com cloud platform