Search results for: squares criterion
Number of results: 125767. Filter results by year:
We consider the problem of reconstructing an unknown function f on a domain X from samples of f at n randomly chosen points with respect to a given measure ρ_X. Given a sequence of linear spaces (V_m)_{m>0} with dim(V_m) = m ≤ n, we study the least squares approximations from the spaces V_m. It is well known that such approximations can be inaccurate when m is too close to n, even when the samples ar...
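As a rough illustration of the setup described above (not the paper's analysis), the sketch below fits a least squares approximation to a function f from n random samples using a polynomial space V_m of dimension m; the target f, the sample size, and the uniform sampling measure on [0, 1] are all placeholder choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    # Placeholder target function; the paper treats a general unknown f.
    return np.sin(2 * np.pi * x)

n, m = 200, 10                        # n random samples, V_m = polynomials of degree < m
x = rng.uniform(0.0, 1.0, size=n)     # samples drawn from a uniform measure rho_X
y = f(x)

# Design matrix for the monomial basis of V_m evaluated at the sample points.
A = np.vander(x, N=m, increasing=True)

# Least squares coefficients: argmin_c ||A c - y||_2.
coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)

# Evaluate the approximation on a grid and report the empirical error.
t = np.linspace(0.0, 1.0, 1000)
approx = np.vander(t, N=m, increasing=True) @ coeffs
print("max error on grid:", np.abs(approx - f(t)).max())
```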
We investigate properties and numerical algorithms for A- and D-optimal regression designs based on the second-order least squares estimator (SLSE). Several theoretical results are derived, including an innovative expression to characterize the A-optimality criterion. We can formulate the optimal design problems under SLSE as semidefinite programming or convex optimization problems and show that ...
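The abstract notes that such design problems can be cast as semidefinite or convex programs. The sketch below solves a classical D-optimal design (for the ordinary least squares estimator, not the SLSE treated in the paper) over a finite candidate set with cvxpy; the candidate points and the quadratic regression functions are illustrative assumptions.

```python
import numpy as np
import cvxpy as cp

# Candidate design points and a quadratic model f(x) = (1, x, x^2); both are assumed.
xs = np.linspace(-1.0, 1.0, 21)
F = np.stack([np.ones_like(xs), xs, xs**2], axis=1)          # rows are f(x_i)^T

w = cp.Variable(len(xs), nonneg=True)                        # design weights
M = sum(w[i] * np.outer(F[i], F[i]) for i in range(len(xs))) # information matrix

# Classical D-optimality: maximize the log-determinant of the information matrix.
prob = cp.Problem(cp.Maximize(cp.log_det(M)), [cp.sum(w) == 1])
prob.solve()

support = [(round(x, 2), round(wi, 3)) for x, wi in zip(xs, w.value) if wi > 1e-3]
print("D-optimal design (point, weight):", support)
```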
In this work, we propose a continuous-domain stochastic model that can be applied to image data. This model is autoregressive and accounts for Gaussian-type as well as non-Gaussian-type innovations. In order to estimate the corresponding parameters from the data, we introduce two possible error criteria, namely Gaussian maximum likelihood and least-squares autocorrelation fit. Exploiting...
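As a simplified, discrete-domain stand-in for the least-squares autocorrelation fit mentioned above (the paper works in the continuous domain), the sketch below estimates AR coefficients by matching empirical autocorrelations in a least squares sense; the model order, the number of lags, and the simulated data are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate an AR(2) process as stand-in data (the paper models continuous-domain images).
a_true = np.array([0.6, -0.3])
n = 5000
x = np.zeros(n)
for t in range(2, n):
    x[t] = a_true[0] * x[t - 1] + a_true[1] * x[t - 2] + rng.standard_normal()

def autocorr(x, lag):
    # Biased sample autocorrelation at the given lag.
    return np.dot(x[:len(x) - lag], x[lag:]) / len(x)

p, L = 2, 8   # model order and number of lags used in the fit (assumed)
r = np.array([autocorr(x, k) for k in range(L + 1)])

# Least squares fit of the Yule-Walker-type relations r[k] = sum_j a_j r[k-j], k = 1..L.
A = np.array([[r[abs(k - j)] for j in range(1, p + 1)] for k in range(1, L + 1)])
b = r[1:L + 1]
a_hat, *_ = np.linalg.lstsq(A, b, rcond=None)
print("true AR coefficients:", a_true, "estimated:", a_hat)
```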
In this paper, we consider the problem of finding the Least Squares estimators of two isotonic regression curves g◦_1 and g◦_2 under the additional constraint that they are ordered; e.g., g◦_1 ≤ g◦_2. Given two sets of n data points y_1, ..., y_n and z_1, ..., z_n observed at (the same) design points, the estimates of the true curves are obtained by minimizing the weighted Least Squares criterio...
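For concreteness, a weighted least squares criterion with the ordering constraint can be written as below; the weights w_i and the shared design points x_1, ..., x_n follow the setup sketched in the abstract, and the exact form used in the paper may differ.

```latex
\min_{g_1, g_2}\; \sum_{i=1}^{n} w_i \bigl(y_i - g_1(x_i)\bigr)^2
              + \sum_{i=1}^{n} w_i \bigl(z_i - g_2(x_i)\bigr)^2
\quad \text{subject to } g_1, g_2 \text{ nondecreasing and } g_1 \le g_2 .
```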
This paper addresses the problem of online quality prediction in processes with multiple operating modes. The paper proposes a new method called mixture of partial least squares regression (Mix-PLS), in which the mixture of experts regression is solved using the partial least squares (PLS) algorithm. The PLS is used to tune the model experts and the gate parameters. The solution...
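A heavily simplified stand-in for the mixture-of-PLS idea is sketched below: one PLS expert per operating mode plus a logistic-regression gate, combined through the gate's posterior probabilities. The two simulated modes, the number of PLS components, and the gating model are all assumptions; the Mix-PLS paper fits the experts and the gate jointly rather than from known mode labels.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)

# Two simulated operating modes with different linear input-output relations (assumed data).
n, d = 400, 5
X1 = rng.normal(0.0, 1.0, (n, d))
y1 = X1 @ np.array([1.0, 2.0, 0.0, 0.0, 1.0]) + 0.1 * rng.standard_normal(n)
X2 = rng.normal(3.0, 1.0, (n, d))
y2 = X2 @ np.array([-1.0, 0.0, 2.0, 1.0, 0.0]) + 0.1 * rng.standard_normal(n)
X, y = np.vstack([X1, X2]), np.concatenate([y1, y2])
mode = np.array([0] * n + [1] * n)

# One PLS expert per mode (fit here on known mode labels for illustration only).
experts = [PLSRegression(n_components=3).fit(X[mode == k], y[mode == k]) for k in (0, 1)]

# Gate: predicts the responsibility of each expert from the inputs.
gate = LogisticRegression(max_iter=1000).fit(X, mode)

# Prediction: gate-weighted combination of the experts' outputs.
P = gate.predict_proba(X)
preds = np.column_stack([e.predict(X).ravel() for e in experts])
y_hat = np.sum(P * preds, axis=1)
print("RMSE:", np.sqrt(np.mean((y_hat - y) ** 2)))
```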
Given an affine algebraic variety V over R with compact set V(R) of real points, and a non-negative polynomial function f ∈ R[V] with finitely many real zeros, we establish a local-global criterion for f to be a sum of squares in R[V]. We then specialize to the case where V is a curve. The notion of virtual compactness is introduced, and it is shown that in the local-global principle, compact...
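For readers unfamiliar with the terminology, saying that f is a sum of squares in R[V] means that f admits a representation of the following standard form; the paper's local-global criterion characterizes when such a representation exists.

```latex
f = \sum_{i=1}^{k} p_i^{\,2} \quad \text{for some } p_1, \dots, p_k \in \mathbb{R}[V].
```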
This paper develops asymptotic properties of a class of sign-error algorithms with expanding truncation bounds for adaptive filtering. Under merely stationary ergodicity and finite second moments of the reference and output signals, and using the trajectory-subsequence (TS) method, it is proved that the algorithm converges almost surely. Then, a mean squares estimate is derived for the estimation ...
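A minimal sketch of a sign-error adaptive filter with expanding truncation bounds is given below, assuming simulated reference/output signals, a fixed step size, and a simple geometric schedule for the bounds; the paper's algorithm and its TS-based convergence analysis are more general.

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated reference signal and noisy output of an unknown FIR filter (assumed setup).
w_true = np.array([0.5, -0.4, 0.3, 0.2])
d = len(w_true)
n = 20000
x = rng.standard_normal(n + d)
y = np.array([w_true @ x[t:t + d] for t in range(n)]) + 0.1 * rng.standard_normal(n)

mu = 0.005                              # step size (assumed)
bounds = 1.0 * 2.0 ** np.arange(30)     # expanding truncation bounds M_0 < M_1 < ...
k = 0                                   # index of the current bound
w = np.zeros(d)

for t in range(n):
    phi = x[t:t + d]                    # regressor
    e = y[t] - w @ phi                  # prediction error
    w = w + mu * phi * np.sign(e)       # sign-error update
    if np.linalg.norm(w) > bounds[k]:   # truncation: reset and enlarge the bound
        w = np.zeros(d)
        k = min(k + 1, len(bounds) - 1)

print("true filter:", w_true, "estimate:", w)
```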
Methods of incorporating a ridge type of regularization into partial redundancy analysis (PRA), constrained redundancy analysis (CRA), and partial and constrained redundancy analysis (PCRA) were discussed. The usefulness of ridge estimation in reducing MSE (mean square error) has been recognized in multiple regression analysis for some time, especially when predictor variables are nearly collin...
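The MSE-reducing effect of ridge regularization under near-collinearity (shown here for plain multiple regression, not the redundancy analysis variants discussed above) can be seen in a small simulation; the data-generating model and the ridge penalty value are assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)

# Nearly collinear predictors: x2 is x1 plus a small perturbation (assumed setup).
n = 60
x1 = rng.standard_normal(n)
x2 = x1 + 0.01 * rng.standard_normal(n)
X = np.column_stack([x1, x2])
beta_true = np.array([1.0, 1.0])

lam = 1.0            # ridge penalty (assumed)
ols_err, ridge_err = [], []
for _ in range(500):
    y = X @ beta_true + rng.standard_normal(n)
    b_ols = np.linalg.solve(X.T @ X, X.T @ y)
    b_ridge = np.linalg.solve(X.T @ X + lam * np.eye(2), X.T @ y)
    ols_err.append(np.sum((b_ols - beta_true) ** 2))
    ridge_err.append(np.sum((b_ridge - beta_true) ** 2))

print("mean squared error of OLS coefficients:  ", np.mean(ols_err))
print("mean squared error of ridge coefficients:", np.mean(ridge_err))
```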
This paper contrasts three related regularization schemes for kernel machines using a least squares criterion, namely Tikhonov and Ivanov regularization and Morozov’s discrepancy principle. We derive the conditions for optimality in a least squares support vector machine context (LS-SVMs) where they differ in the role of the regularization parameter. In particular, the Ivanov and Morozov scheme...
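In generic notation (not the LS-SVM-specific formulation of the paper), the three schemes can be written as follows, with training errors e_i = y_i - w^T φ(x_i) - b and hyperparameters C, R, and σ respectively:

```latex
\text{Tikhonov:} \quad \min_{w,b,e}\; \tfrac{1}{2}\|w\|^2 + \tfrac{C}{2}\sum_i e_i^2
\qquad
\text{Ivanov:} \quad \min_{w,b,e}\; \tfrac{1}{2}\sum_i e_i^2 \;\; \text{s.t.}\;\; \|w\|^2 \le R^2
\qquad
\text{Morozov:} \quad \min_{w,b,e}\; \tfrac{1}{2}\|w\|^2 \;\; \text{s.t.}\;\; \sum_i e_i^2 \le \sigma^2
```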
We present a subspace-based variant of LS-SVMs (i.e. regularization networks) that sequentially processes the data and is hence especially suited for online learning tasks. The algorithm works by selecting from the data set a small subset of basis functions that is subsequently used to approximate the full kernel on arbitrary points. This subset is identified online from the data stream. We imp...
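One common way to pick such a subset online is an approximate-linear-dependence test: a new point joins the basis only if it cannot be well represented by the current basis in the kernel-induced feature space. The sketch below uses a Gaussian kernel and a residual threshold; both are assumptions, and the paper's selection and update rules may differ.

```python
import numpy as np

rng = np.random.default_rng(5)

def kernel(a, b, gamma=1.0):
    # Gaussian (RBF) kernel between rows of a and rows of b (assumed kernel choice).
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

threshold = 0.1                               # maximum tolerated squared residual (assumed)
stream = rng.uniform(-3, 3, size=(500, 2))    # simulated data stream

basis = [stream[0]]
for x in stream[1:]:
    B = np.asarray(basis)
    K = kernel(B, B) + 1e-8 * np.eye(len(B))  # kernel matrix of the current basis
    k_x = kernel(B, x[None, :]).ravel()
    # Squared norm of the residual of phi(x) projected onto span{phi(b) : b in basis}.
    resid = kernel(x[None, :], x[None, :])[0, 0] - k_x @ np.linalg.solve(K, k_x)
    if resid > threshold:
        basis.append(x)

print("basis size after processing the stream:", len(basis))
```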
[Chart: number of search results per year]