Gauss–Newton–Secant Method for Solving Nonlinear Least Squares Problems under Generalized Lipschitz Conditions

Authors

Abstract

We develop a local convergence analysis of an iterative method for solving nonlinear least squares problems with operator decomposition under classical and generalized Lipschitz conditions. We consider the cases of both zero and nonzero residuals and determine the corresponding convergence orders. We use two types of Lipschitz conditions (center and restricted-region conditions) to study the method. Moreover, we obtain a larger convergence radius and tighter error estimates than in previous works. Hence, we extend the applicability of this method under the same computational effort.
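For orientation, Gauss–Newton–Secant methods of this kind are typically stated for the problem min_x ||F(x) + G(x)||^2, where F is differentiable and G is continuous but possibly nondifferentiable, with the iteration A_k = F'(x_k) + [x_k, x_{k-1}; G] and x_{k+1} = x_k - (A_k^T A_k)^{-1} A_k^T (F(x_k) + G(x_k)), where [., .; G] denotes a divided difference of G. The following is a minimal numerical sketch under that assumption; the componentwise divided difference, the helper names, and the small test problem are illustrative choices, not the paper's implementation.

import numpy as np

def divided_difference(G, x, y):
    # Componentwise first-order divided difference [x, y; G]: column j switches
    # the j-th argument from y_j to x_j (one illustrative choice among many).
    m, n = G(x).size, x.size
    D = np.zeros((m, n))
    z = np.array(y, dtype=float)
    for j in range(n):
        z_old = z.copy()
        z[j] = x[j]
        h = x[j] - y[j]
        if abs(h) < 1e-12:                # nearly equal components: forward difference
            h = 1e-8
            z[j] = y[j] + h
        D[:, j] = (G(z) - G(z_old)) / h
    return D

def gauss_newton_secant(F, dF, G, x0, x1, tol=1e-12, max_iter=100):
    # Sketch of a Gauss-Newton-Secant iteration for min ||F(x) + G(x)||^2,
    # with F differentiable and G continuous but possibly nondifferentiable.
    x_prev = np.asarray(x0, dtype=float)
    x = np.asarray(x1, dtype=float)
    for _ in range(max_iter):
        r = F(x) + G(x)                               # residual of the decomposed operator
        A = dF(x) + divided_difference(G, x, x_prev)  # A_k = F'(x_k) + [x_k, x_{k-1}; G]
        step, *_ = np.linalg.lstsq(A, r, rcond=None)  # Gauss-Newton step via least squares
        x_prev, x = x, x - step
        if np.linalg.norm(step) < tol:
            break
    return x

# Illustrative zero-residual problem with solution x* = (1, 1): F is smooth,
# G contributes a small nondifferentiable |.| term.
F = lambda x: np.array([x[0]**2 - 1.0, x[0] * x[1] - 1.0])
dF = lambda x: np.array([[2.0 * x[0], 0.0], [x[1], x[0]]])
G = lambda x: np.array([0.0, 0.1 * abs(x[1] - 1.0)])
print(gauss_newton_secant(F, dF, G, x0=[1.3, 0.7], x1=[1.2, 0.8]))

The least-squares solve is used in place of forming the normal equations (A_k^T A_k)^{-1} A_k^T explicitly, which is the numerically safer equivalent of the same step.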

Similar Articles

Randomized Algorithms for Solving Large Scale Nonlinear Least Squares Problems

This thesis presents key contributions towards devising highly efficient stochastic reconstruction algorithms for solving large-scale inverse problems, where a large data set is available and the underlying physical system is complex, e.g., modeled by partial differential equations (PDEs). We begin by developing stochastic and deterministic dimensionality reduction methods to transform the ori...

Newton-Krylov Type Algorithm for Solving Nonlinear Least Squares Problems

The minimization of a quadratic function within an ellipsoidal trust region is an important subproblem for many nonlinear programming algorithms. When the number of variables is large, one of the most widely used strategies is to project the original problem into a low-dimensional subspace. In this paper, we introduce an algorithm for solving nonlinear least squares problems. This algorithm i...
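As a rough illustration of the subspace-projection strategy described above, the sketch below minimizes a quadratic model g^T p + 0.5 p^T B p over a ball ||p|| <= delta by restricting p to a small Krylov subspace built from g. This is an assumed, generic formulation rather than that article's algorithm; the function names are hypothetical, the reduced problem is solved by a simple bisection on the regularization shift, and the degenerate "hard case" is ignored.

import numpy as np

def krylov_basis(B, g, dim):
    # Orthonormal basis of span{g, Bg, B^2 g, ...} with full re-orthogonalization.
    vectors = []
    v = g / np.linalg.norm(g)
    for _ in range(min(dim, g.size)):
        vectors.append(v)
        w = B @ v
        for q in vectors:
            w = w - (q @ w) * q
        norm_w = np.linalg.norm(w)
        if norm_w < 1e-12:
            break
        v = w / norm_w
    return np.column_stack(vectors)

def projected_trust_region_step(g, B, delta, dim=5):
    # Approximately solve  min g^T p + 0.5 p^T B p  s.t. ||p|| <= delta
    # by projecting onto a small Krylov subspace (generic sketch).
    Q = krylov_basis(B, g, dim)
    Bs, gs = Q.T @ B @ Q, Q.T @ g          # reduced model; ||Q y|| = ||y||
    lam_min = np.linalg.eigvalsh(Bs).min()
    if lam_min > 0.0:
        y = np.linalg.solve(Bs, -gs)
        if np.linalg.norm(y) <= delta:
            return Q @ y                   # interior Newton step is already feasible
    # Boundary solution: find lam >= max(0, -lam_min) with ||y(lam)|| = delta.
    eye = np.eye(Bs.shape[0])
    step_norm = lambda lam: np.linalg.norm(np.linalg.solve(Bs + lam * eye, -gs))
    lo = max(0.0, -lam_min) + 1e-10
    hi = lo + 1.0
    while step_norm(hi) > delta:           # bracket the root
        hi *= 2.0
    for _ in range(100):                   # bisection on the shift
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if step_norm(mid) > delta else (lo, mid)
    return Q @ np.linalg.solve(Bs + hi * eye, -gs)

# Example: a convex quadratic in 200 variables with a tight radius.
rng = np.random.default_rng(0)
M = rng.standard_normal((200, 200))
B = M.T @ M + np.eye(200)                  # symmetric positive definite Hessian
g = rng.standard_normal(200)
p = projected_trust_region_step(g, B, delta=0.5)
print(np.linalg.norm(p))                   # <= 0.5 up to bisection tolerance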

Supervised Descent Method for Solving Nonlinear Least Squares Problems in Computer Vision

Many computer vision problems (e.g., camera calibration, image alignment, structure from motion) are solved with nonlinear optimization methods. It is generally accepted that second order descent methods are the most robust, fast, and reliable approaches for nonlinear optimization of a general smooth function. However, in the context of computer vision, second order descent methods have two mai...

Generalized Nonlinear Inverse Problems Solved Using the Least Squares Criterion

The aim of the physical sciences is to discover the minimal set of parameters which completely describe physical systems and the laws relating the values of these parameters to the results of any set of measurements on the system. A coherent set of such laws is named a physical theory. To the extent that the values of the parameters can only be obtained as a result of measurements, one may equival...

Journal

Journal title: Axioms

Year: 2021

ISSN: 2075-1680

DOI: https://doi.org/10.3390/axioms10030158