A modified Liu-Storey-Conjugate descent hybrid projection method for convex constrained nonlinear equations and image restoration

Authors

Abstract

<p style='text-indent:20px;'>We present an iterative method for solving the convex constrained nonlinear equation problem. The method incorporates the projection strategy of Solodov and Svaiter with the hybrid Liu-Storey Conjugate descent method of Yang et al. for unconstrained optimization. The proposed method does not require Jacobian information, nor does it need to store any matrix at each iteration. Thus, it has the potential to solve large-scale non-smooth problems. Under some standard assumptions, the convergence analysis of the method is established. Finally, to show the applicability of the method, it is used to solve <inline-formula><tex-math id="M1">\begin{document}$ \ell_1 $\end{document}</tex-math></inline-formula>-norm regularized problems to restore blurred and noisy images. The numerical experiment indicates that our results are a significant improvement compared with related methods for the same problem.</p>
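To illustrate the projection strategy of Solodov and Svaiter that the abstract refers to, the sketch below implements a generic derivative-free hyperplane projection scheme for a constrained monotone equation. It is not the paper's method: the hybrid Liu-Storey Conjugate descent direction is replaced by the simple placeholder direction d = -F(x), and a box is used as an example convex feasible set; the line-search parameters are assumed values.

```python
import numpy as np

def project_box(x, lo, hi):
    """Projection onto the box [lo, hi], used here as an example convex set."""
    return np.clip(x, lo, hi)

def solve(F, x0, lo, hi, sigma=1e-4, beta=0.5, tol=1e-8, max_iter=500):
    """Hyperplane projection method (Solodov-Svaiter style) for monotone F.

    Derivative-free: uses only evaluations of F, no Jacobian, and stores
    no matrices. The search direction below is a placeholder, not the
    paper's hybrid LS-CD direction.
    """
    x = project_box(np.asarray(x0, dtype=float), lo, hi)
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            break
        d = -Fx  # placeholder direction d_k
        # Derivative-free line search: shrink alpha until
        #   -F(x + alpha d)^T d >= sigma * alpha * ||d||^2
        alpha = 1.0
        while -F(x + alpha * d) @ d < sigma * alpha * (d @ d):
            alpha *= beta
            if alpha < 1e-12:
                break
        z = x + alpha * d
        Fz = F(z)
        # Project the current iterate onto the hyperplane through z that
        # separates x from the solution set, then back onto the feasible set.
        x = project_box(x - (Fz @ (x - z)) / (Fz @ Fz) * Fz, lo, hi)
    return x

# Example: the monotone map F(x) = x + sin(x) on the box [-2, 2]^3.
x_star = solve(lambda x: x + np.sin(x), np.array([1.5, -1.0, 0.7]), -2.0, 2.0)
```

The projection onto the separating hyperplane is what makes the iterates Fejér monotone with respect to the solution set, which is the key to global convergence without Jacobian information.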


Similar articles

Global Convergence of a Modified Liu-storey Conjugate Gradient Method

In this paper, we make a modification to the LS conjugate gradient method and propose a descent LS method. The method generates a sufficient descent direction for the objective function. We prove that the method is globally convergent with an Armijo-type line search. Moreover, under mild conditions, we show that the method is globally convergent if the Armijo line search or the Wolfe line sea...


A hybrid steepest descent method for constrained convex optimization

This paper describes a hybrid steepest descent method that decreases any given convex cost function over time while keeping the optimization variables within any given convex set. The method takes advantage of properties of hybrid systems to avoid the computation of projections or of a dual optimum. The convergence to a global optimum is analyzed using Lyapunov stability arguments. A discretized imp...


A MODIFIED STEFFENSEN'S METHOD WITH MEMORY FOR NONLINEAR EQUATIONS

In this note, we propose a modification of Steffensen's method with some free parameters. These parameters are then used for further acceleration via the concept of memorization. In this way, we derive a fast Steffensen-type method with memory for solving nonlinear equations. Numerical results are also given to support the underlying theory of the article.


A New Descent Nonlinear Conjugate Gradient Method for Unconstrained Optimization

In this paper, a new nonlinear conjugate gradient method is proposed for large-scale unconstrained optimization. The sufficient descent property holds without any line search. We use a steplength technique which ensures that the Zoutendijk condition holds, and the method is proved to be globally convergent. Finally, we improve the method and carry out further analysis.


Norm descent conjugate gradient methods for solving symmetric nonlinear equations

The nonlinear conjugate gradient method is very popular for solving large-scale unconstrained minimization problems due to its simple iterative form and low storage requirement. In recent years, it was successfully extended to solve higher-dimensional monotone nonlinear equations. Nevertheless, research on the conjugate gradient method for symmetric equations is just beginning. This s...



Journal

Journal title: Numerical Algebra, Control and Optimization

Year: 2022

ISSN: 2155-3297, 2155-3289

DOI: https://doi.org/10.3934/naco.2021022