Search results for: weighted least square

Number of results: 591168

1996
Maurizio Pilu Andrew W. Fitzgibbon Robert B. Fisher

This work presents the first direct method for specifically fitting ellipses in the least squares sense. Previous approaches used either generic conic fitting or relied on iterative methods to recover elliptic solutions. The proposed method is (i) ellipse-specific, (ii) directly solved by a generalised eigensystem, (iii) has a desirable low-eccentricity bias, and (iv) is robust to noise. We provide a...

Journal: IEEE Trans. Pattern Anal. Mach. Intell. 1999
Andrew W. Fitzgibbon Maurizio Pilu Robert B. Fisher

This work presents a new efficient method for fitting ellipses to scattered data. Previous algorithms either fitted general conics or were computationally expensive. By minimizing the algebraic distance subject to the constraint 4ac − b² = 1, the new method incorporates the ellipticity constraint into the normalization factor. The proposed method combines several advantages: (i) It is ellipse-specific so ...
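For illustration, here is a minimal NumPy/SciPy sketch of direct ellipse-specific fitting in the spirit of the two abstracts above: the algebraic distance ||Da||² is minimised subject to 4ac − b² = 1, which turns the problem into a generalised eigenproblem. The function name, the choice of scipy.linalg.eig, and the toy data are assumptions made here for illustration, not the authors' reference implementation.

import numpy as np
from scipy.linalg import eig

def fit_ellipse_direct(x, y):
    """Return conic coefficients (a, b, c, d, e, f), scaled so that 4ac - b^2 = 1."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    # Design matrix: one row [x^2, xy, y^2, x, y, 1] per data point.
    D = np.column_stack([x * x, x * y, y * y, x, y, np.ones_like(x)])
    S = D.T @ D                       # scatter matrix
    # Constraint matrix encoding a^T C a = 4ac - b^2.
    C = np.zeros((6, 6))
    C[0, 2] = C[2, 0] = 2.0
    C[1, 1] = -1.0
    # Generalised eigenproblem S a = lambda C a; the ellipse solution is the
    # eigenvector with a^T C a > 0 (unique for non-degenerate data).
    vals, vecs = eig(S, C)
    V = vecs.real
    quad = np.einsum('ij,jk,ik->i', V.T, C, V.T)   # a^T C a for each eigenvector
    a = V[:, quad > 0][:, 0]
    return a / np.sqrt(a @ C @ a)

# Toy usage: noisy samples from an ellipse centred at (0, 1).
t = np.linspace(0.0, 2.0 * np.pi, 100)
xs = 3.0 * np.cos(t) + 0.05 * np.random.randn(100)
ys = 2.0 * np.sin(t) + 1.0 + 0.05 * np.random.randn(100)
print(fit_ellipse_direct(xs, ys))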

2014
Genevera I. Allen Jonathan Taylor

Taylor & Francis makes every effort to ensure the accuracy of all the information (the “Content”) contained in the publications on our platform. However, Taylor & Francis, our agents, and our licensors make no representations or warranties whatsoever as to the accuracy, completeness, or suitability for any purpose of the Content. Any opinions and views expressed in this publication are the opin...

2002
Thomas Magesacher Per Ola Börjesson Per Ödling Tomas Nordström

The least-mean-square (LMS) algorithm is an adaptation scheme widely used in practice due to its simplicity. In some applications the involved signals are continuous-time. Then, usually either a fully analog implementation of the LMS algorithm is applied or the input data are sampled by analog-to-digital (AD) converters to be processed digitally. A purely digital realization is most often the p...
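As a concrete reference point, the following is a minimal sketch of the standard discrete-time LMS update that this abstract builds on, applied to identifying an unknown FIR system from sampled data. The filter length, step size mu and test signal are illustrative assumptions, not taken from the paper.

import numpy as np

def lms_identify(x, d, n_taps=3, mu=0.02):
    """Adapt an n_taps FIR filter so that its output tracks the desired signal d."""
    w = np.zeros(n_taps)                    # adaptive weights
    y_hat = np.zeros(len(d))
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1:n + 1][::-1]   # current and past input samples
        y_hat[n] = w @ u                    # filter output
        e = d[n] - y_hat[n]                 # instantaneous error
        w = w + mu * e * u                  # LMS weight update
    return w, y_hat

# Toy usage: identify an unknown 3-tap system driven by white noise.
rng = np.random.default_rng(0)
x = rng.standard_normal(2000)
unknown = np.array([0.8, -0.4, 0.2])
d = np.convolve(x, unknown)[:len(x)] + 0.01 * rng.standard_normal(len(x))
w, _ = lms_identify(x, d, n_taps=3, mu=0.02)
print(w)   # should be close to [0.8, -0.4, 0.2]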

Journal: Foundations of Computational Mathematics 2006
Qiang Wu Yiming Ying Ding-Xuan Zhou

This paper considers the regularized learning algorithm associated with the least-square loss and reproducing kernel Hilbert spaces. The target is the error analysis for the regression problem in learning theory. A novel regularization approach is presented, which yields satisfactory learning rates. The rates depend on the approximation property and the capacity of the reproducing kernel Hilbert...
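A minimal sketch of regularised least-square learning in an RKHS, assuming a Gaussian kernel and an illustrative regularisation parameter (neither is specified in the abstract): by the representer theorem, the minimiser of the empirical squared loss plus λ||f||²_K is f(x) = Σ_i α_i K(x, x_i) with (K + λmI)α = y.

import numpy as np

def gaussian_kernel(A, B, sigma=0.1):
    """Gaussian (RBF) kernel matrix between the rows of A and B."""
    d2 = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2.0 * A @ B.T
    return np.exp(-d2 / (2.0 * sigma**2))

def fit_regularized_ls(X, y, lam=1e-3, sigma=0.1):
    """Return the learned function f(x) = sum_i alpha_i K(x, x_i)."""
    m = len(y)
    K = gaussian_kernel(X, X, sigma)
    alpha = np.linalg.solve(K + lam * m * np.eye(m), y)
    return lambda Xnew: gaussian_kernel(Xnew, X, sigma) @ alpha

# Toy usage: regression on noisy samples of sin(2*pi*x) on [0, 1].
rng = np.random.default_rng(1)
X = rng.uniform(0.0, 1.0, size=(50, 1))
y = np.sin(2.0 * np.pi * X[:, 0]) + 0.1 * rng.standard_normal(50)
f = fit_regularized_ls(X, y)
print(f(np.array([[0.25], [0.75]])))   # roughly [1, -1]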

2005
Armin Zeinali

Many researchers have been interested in the approximation properties of fuzzy logic systems (FLS), which, like neural networks, can be seen as approximation schemes. Almost all of them tackled the Mamdani fuzzy model, which was shown to have many interesting features. This paper aims to present alternatives to traditional inference mechanisms and the CRI method. The most attractive advantage of these new m...

2003
Hervé Abdi

PLS regression is a recent technique that generalizes and combines features from principal component analysis and multiple regression. Its goal is to predict or analyze a set of dependent variables from a set of independent variables or predictors. This prediction is achieved by extracting from the predictors a set of orthogonal factors called latent variables which have the best predictive pow...
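To make the latent-variable idea concrete, below is a minimal PLS1 (single response) sketch using NIPALS-style deflation: each weight vector points in the direction of maximal covariance between predictors and response, the score t = Xw is the corresponding latent variable, and X and y are deflated before the next component is extracted. The function name, the number of components and the toy data are illustrative assumptions, not part of the cited tutorial.

import numpy as np

def pls1_fit(X, y, n_components=2):
    """Return regression coefficients for centred predictors, via PLS1/NIPALS."""
    X = X - X.mean(axis=0)
    y = y - y.mean()
    W, P, Q = [], [], []
    for _ in range(n_components):
        w = X.T @ y
        w = w / np.linalg.norm(w)        # weight vector: direction of max covariance with y
        t = X @ w                        # score (latent variable)
        p = X.T @ t / (t @ t)            # X loading
        q = (y @ t) / (t @ t)            # y loading
        X = X - np.outer(t, p)           # deflate predictors
        y = y - q * t                    # deflate response
        W.append(w); P.append(p); Q.append(q)
    W, P, Q = np.array(W).T, np.array(P).T, np.array(Q)
    # Coefficients expressed in terms of the original centred predictors.
    return W @ np.linalg.solve(P.T @ W, Q)

# Toy usage: the response depends mainly on the first two predictors.
rng = np.random.default_rng(2)
X = rng.standard_normal((100, 5))
y = 2.0 * X[:, 0] - X[:, 1] + 0.1 * rng.standard_normal(100)
print(pls1_fit(X, y))   # roughly [2, -1, 0, 0, 0]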

Journal: Bulletin of the Iranian Mathematical Society 2013
Q. Wang G. Yu

In this paper, we derive the necessary and sufficient conditions for the quaternion matrix equation XA=B to have a least-square bisymmetric solution and give the expression of such a solution when the solvability conditions are met. Furthermore, we consider the maximal and minimal inertias of the least-square bisymmetric solution to this equation. As applications, we derive sufficient and necess...
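As a much-simplified illustration of the underlying least-square problem (real matrices only, without the quaternion arithmetic or the bisymmetry constraint treated in the paper), the unconstrained minimiser of ||XA − B||_F can be obtained with a standard least-squares solver; the matrix sizes below are arbitrary.

import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((4, 6))
B = rng.standard_normal((3, 6))

# XA = B is equivalent to A^T X^T = B^T, a standard least-squares problem.
Xt, *_ = np.linalg.lstsq(A.T, B.T, rcond=None)
X = Xt.T
print(np.linalg.norm(X @ A - B))   # Frobenius residual of the least-square solution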

Chart of the number of search results per year
