Search results for: least squares problems
Number of results: 949,960
X. Cui, Department of Informatics, School of Multidisciplinary Sciences, The Graduate University for Advanced Studies (Sokendai), 2-1-2 Hitotsubashi, Chiyoda-ku, Tokyo 101-8430, Japan; K. Hayami, Principles of Informatics Research Division, National Institute of Informatics, 2-1-2 Hitotsubashi, Chiyoda-ku, Tokyo 101-8430, Japan; J. Yin, Department of Mathematics, Tongji University, Shanghai, P.R....
It is well known that the Gauss-Newton algorithm for solving nonlinear least squares problems is a special case of the scoring algorithm for maximizing log likelihoods. What has received less attention is that the computation of the current correction in the scoring algorithm, in both its line search and trust region forms, can be cast as a linear least squares problem. This is an important observ...
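A minimal sketch of the idea described above, not the scoring algorithm itself: one Gauss-Newton correction for min_x ½‖r(x)‖² obtained by solving the linear least squares problem min_p ‖J(x)p + r(x)‖₂ rather than the normal equations. The exponential-fit model and all variable names below are illustrative assumptions.

```python
import numpy as np

def gauss_newton_step(r, J, x):
    """Return the correction p solving the linear LS problem min_p ||J(x) p + r(x)||_2."""
    p, *_ = np.linalg.lstsq(J(x), -r(x), rcond=None)
    return p

# Hypothetical test problem: fit y ~ a * exp(-b * t) to data (t_i, y_i).
t = np.linspace(0.0, 1.0, 20)
y = 2.0 * np.exp(-1.5 * t)

def r(x):                      # residual vector of the nonlinear LS problem
    a, b = x
    return a * np.exp(-b * t) - y

def J(x):                      # Jacobian of the residuals
    a, b = x
    e = np.exp(-b * t)
    return np.column_stack([e, -a * t * e])

x = np.array([1.0, 1.0])
for _ in range(10):            # plain iteration, no line search, for illustration only
    x = x + gauss_newton_step(r, J, x)
print(x)                       # approaches [2.0, 1.5]
```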
This paper studies the normwise perturbation theory for structured least squares problems. The structures under investigation are symmetric, persymmetric, skew-symmetric, Toeplitz and Hankel. We present the condition numbers for structured least squares. AMS subject classification (2000): 15A18, 65F20, 65F25, 65F50.
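For orientation only: the structured condition numbers of the paper are not reproduced here; the sketch below merely computes the classical unstructured spectral condition number κ₂(A) = σ_max/σ_min for a Toeplitz least squares matrix, which is the baseline that structured condition numbers refine. The matrix entries are arbitrary example data.

```python
import numpy as np
from scipy.linalg import toeplitz

col = np.array([4.0, 1.0, 0.5, 0.25, 0.1])   # first column (example values)
row = np.array([4.0, 2.0, 1.0, 0.5, 0.2])    # first row (example values)
A = toeplitz(col, row)

s = np.linalg.svd(A, compute_uv=False)
kappa = s[0] / s[-1]                          # kappa_2(A) = sigma_max / sigma_min
print(f"unstructured condition number: {kappa:.3e}")
```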
Abstract — We show in this paper that many different least-squares problems which have applications in signal processing may be seen as special cases of a more general vector space minimization problem called the Minimum Norm problem. We show that special cases of the Minimum Norm problem include: least squares fitting of a finite set of points to a linear equation and to a quadratic equation; ...
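A minimal sketch of one of the special cases listed above: least squares fitting of a finite set of points to a linear equation y = c0 + c1·x, posed as min_c ‖Xc − y‖₂. The data and names are illustrative; the quadratic-fit case only changes the design matrix.

```python
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 1.9, 3.2, 3.8, 5.1])

X = np.column_stack([np.ones_like(x), x])     # design matrix [1, x]
c, *_ = np.linalg.lstsq(X, y, rcond=None)
print("intercept, slope:", c)

# Quadratic-equation special case: same minimum-norm machinery, larger design matrix.
Xq = np.column_stack([np.ones_like(x), x, x**2])
cq, *_ = np.linalg.lstsq(Xq, y, rcond=None)
print("quadratic coefficients:", cq)
```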
This paper analyzes linear least squares problems with absolute quadratic constraints. We develop a generalized theory following Bookstein’s conic-fitting and Fitzgibbon’s direct ellipse-specific fitting. Under simple preconditions, it can be shown that a minimum always exists and can be determined by a generalized eigenvalue problem. This problem is numerically reduced to an eigenvalue problem...
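A hedged sketch in the spirit of Fitzgibbon's direct ellipse-specific fitting cited above, not the paper's generalized theory: minimize ‖Da‖² subject to the absolute quadratic constraint aᵀCa = 1 (here 4ac − b² = 1), which leads to the generalized eigenvalue problem Sa = λCa. The synthetic data and the selection of the single positive eigenvalue follow the standard textbook presentation.

```python
import numpy as np
from scipy.linalg import eig

rng = np.random.default_rng(0)
t = np.linspace(0, 2 * np.pi, 100)            # synthetic noisy points on an ellipse
x = 3.0 * np.cos(t) + 0.01 * rng.standard_normal(t.size)
y = 1.5 * np.sin(t) + 0.01 * rng.standard_normal(t.size)

D = np.column_stack([x**2, x * y, y**2, x, y, np.ones_like(x)])  # design matrix
S = D.T @ D                                                      # scatter matrix

C = np.zeros((6, 6))                                             # constraint 4ac - b^2 = 1
C[0, 2] = C[2, 0] = 2.0
C[1, 1] = -1.0

w, V = eig(S, C)                     # generalized eigenvalue problem S a = lambda C a
w = np.real(w)
cand = np.flatnonzero(np.isfinite(w) & (w > 0))   # ellipse solution: the positive eigenvalue
a = np.real(V[:, cand[0]])
print("conic coefficients a..f:", a)
```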
The standard iterative method for solving large sparse least squares problems min_{x∈R^n} ‖b − Ax‖_2, A ∈ R^{m×n}, is the CGLS method, or its stabilized version LSQR, which applies the (preconditioned) conjugate gradient method to the normal equations A^T A x = A^T b. In this paper, we will consider alternative methods using a matrix B ∈ R^{n×m} and applying the Generalized Minimal Residual (GMRES) method to min...
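A hedged illustration of the two approaches contrasted above: LSQR applied directly to min_x ‖b − Ax‖₂ versus GMRES applied to the n×n system (BA)x = Bb. Here B = Aᵀ is chosen purely as the simplest example (it reproduces the normal equations); the paper studies more general choices of B.

```python
import numpy as np
from scipy.sparse.linalg import lsqr, gmres, LinearOperator

rng = np.random.default_rng(0)
m, n = 200, 50
A = rng.standard_normal((m, n))        # dense stand-in for a sparse matrix
b = rng.standard_normal(m)

# Baseline: LSQR on the least squares problem itself.
x_lsqr = lsqr(A, b)[0]

# Alternative: GMRES on (B A) x = B b with the illustrative choice B = A^T.
BA = LinearOperator((n, n), matvec=lambda v: A.T @ (A @ v), dtype=float)
x_gmres, info = gmres(BA, A.T @ b, atol=1e-10)

print(np.linalg.norm(x_lsqr - x_gmres))   # the two solutions should agree closely
```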
In this note we analyze the influence of the regularization procedure applied to singular LS problems. It appears that, due to finite numerical accuracy within the computer calculations, the regularization parameter has to belong to a particular range of values in order to have the regularized solution close to that associated with the singular LS problem. Surprisingly enough, this range essential...
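A minimal sketch of regularizing a singular least squares problem in the standard Tikhonov form (not the specific procedure analyzed in the note): x_μ = argmin ‖Ax − b‖₂² + μ‖x‖₂², computed for a sweep of parameter values and compared with the minimum-norm solution of the singular problem. The problem data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)
m, n, r = 30, 10, 6                       # rank-deficient A (rank r < n)
A = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
b = rng.standard_normal(m)

x_ls = np.linalg.pinv(A) @ b              # minimum-norm solution of the singular LS problem

for mu in [1e-2, 1e-6, 1e-10, 1e-14]:
    # Regularized problem as an augmented LS system  [[A], [sqrt(mu) I]] x ~ [[b], [0]].
    Aaug = np.vstack([A, np.sqrt(mu) * np.eye(n)])
    baug = np.concatenate([b, np.zeros(n)])
    x_mu, *_ = np.linalg.lstsq(Aaug, baug, rcond=None)
    print(f"mu={mu:.0e}  ||x_mu - x_ls|| = {np.linalg.norm(x_mu - x_ls):.2e}")
```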
This paper presents two reformulations of the dual of the constrained least squares problem over convex cones. In addition, it extends Nesterov’s excessive gap method 1 [21] to more general problems. The conic least squares problem is then solved by applying the resulting modified method, or Nesterov’s smooth method [22], or Nesterov’s excessive gap method 2 [21], to the dual reformulations. Nu...
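For context only, an illustrative instance of a constrained least squares problem over a convex cone (the nonnegative orthant), solved with a standard active-set NNLS solver rather than the dual reformulations or excessive gap methods discussed in the paper. Data are random placeholders.

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(2)
A = rng.standard_normal((40, 8))
b = rng.standard_normal(40)

x, residual_norm = nnls(A, b)     # min ||A x - b||_2  subject to  x >= 0
print(x, residual_norm)
```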
The standard approaches to solving overdetermined linear systems Ax ≈ b construct minimal corrections to the vector b and/or the matrix A such that the corrected system is compatible. In ordinary least squares (LS) the correction is restricted to b, while in data least squares (DLS) it is restricted to A. In scaled total least squares (Scaled TLS) [15], corrections to both b and A are allowed, ...
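A hedged sketch contrasting ordinary LS (correction restricted to b) with classical total least squares (corrections to both A and b), the latter computed from the SVD of the augmented matrix [A b]. The scaling parameter of Scaled TLS from [15] is not reproduced; the data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(3)
m, n = 50, 3
A = rng.standard_normal((m, n))
x_true = np.array([1.0, -2.0, 0.5])
b = A @ x_true + 0.01 * rng.standard_normal(m)

x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)     # ordinary LS: correct only b

Ab = np.column_stack([A, b])                      # TLS: correct both A and b
_, _, Vt = np.linalg.svd(Ab)
v = Vt[-1]                                        # right singular vector for sigma_min
x_tls = -v[:n] / v[n]                             # classical TLS solution (v[n] != 0 assumed)
print(x_ls, x_tls)
```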
Usually, generalized least squares problems are solved by transforming them into regular least squares problems, which can then be solved by well-known numerical methods. However, this approach is not very effective in some cases and, besides, is very expensive for large-scale problems. In 1979, Paige suggested another approach, which consists of solving an equivalent equality-constrained least sq...
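A minimal sketch of the standard transformation mentioned above (not Paige's equality-constrained formulation): a generalized least squares problem min_x (b − Ax)ᵀW⁻¹(b − Ax) with covariance W = LLᵀ is reduced to the regular least squares problem min_x ‖L⁻¹(b − Ax)‖₂ by whitening. The covariance matrix below is a synthetic assumption.

```python
import numpy as np
from scipy.linalg import cholesky, solve_triangular

rng = np.random.default_rng(4)
m, n = 30, 4
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)

M = rng.standard_normal((m, m))
W = M @ M.T + m * np.eye(m)                 # synthetic SPD covariance matrix
L = cholesky(W, lower=True)                 # W = L L^T

A_w = solve_triangular(L, A, lower=True)    # L^{-1} A
b_w = solve_triangular(L, b, lower=True)    # L^{-1} b
x_gls, *_ = np.linalg.lstsq(A_w, b_w, rcond=None)   # regular LS on the whitened problem
print(x_gls)
```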