Search results for: tikhonov
Number of results: 1537
An explicit snake is a smooth closed curve which deforms towards the desired features in an image. There are two types of force controlling the motion of the snake: internal and external. The former usually constrains the snake's curvature and tension, through first- and second-order Tikhonov smoothness terms, while the latter generates attraction forces. To investigate the possible role ...
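The internal energy the abstract describes can be sketched as discrete first- and second-order smoothness terms on a closed curve. A minimal illustration, assuming periodic finite differences; the coefficient names `alpha` and `beta` are the conventional tension/curvature weights, not taken from the text:

```python
import numpy as np

def snake_internal_energy(pts, alpha=1.0, beta=1.0):
    # pts: (n, 2) array of points on a closed curve, so differences
    # wrap around circularly (np.roll gives periodic neighbours)
    d1 = np.roll(pts, -1, axis=0) - pts                                # tension term
    d2 = np.roll(pts, -1, axis=0) - 2 * pts + np.roll(pts, 1, axis=0)  # curvature term
    return alpha * (d1 ** 2).sum() + beta * (d2 ** 2).sum()
```

In a full snake algorithm this energy is minimized jointly with the external (image-attraction) energy; here only the internal part is shown.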
Instead of the Tikhonov regularization method, in which a scalar is the regularization parameter, Liu et al. [1] have proposed a novel regularization method with a vector as the regularization parameter. As a continuation, we further propose an optimally scaled vector regularization method (OSVRM) to solve ill-posed linear problems, which is better than the Tikhonov regularization ...
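The scalar-versus-vector distinction can be made concrete with a generic sketch (the paper's OSVRM and its optimal scaling are not reproduced here; all names are illustrative). With a vector parameter, each solution component gets its own penalty weight:

```python
import numpy as np

def tikhonov_scalar(A, b, lam):
    # classical Tikhonov: one scalar lam penalizes all components equally
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

def tikhonov_vector(A, b, lam_vec):
    # vector variant: component-wise penalties via a diagonal matrix
    return np.linalg.solve(A.T @ A + np.diag(lam_vec), A.T @ b)
```

When `lam_vec` is constant the two coincide; the vector form only pays off when the weights are chosen adaptively, which is the point of the method the abstract describes.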
The emulation of mechanical systems is a popular application of artificial neural networks in engineering. This paper examines general principles of modelling mechanical systems by feedforward artificial neural networks (FFANNs). The slow convergence issue associated with the highly parallel and redundant structure of FFANN systems is addressed by formulating criteria for constraining network p...
Tikhonov regularization of linear discrete ill-posed problems often is applied with a finite difference regularization operator that approximates a low-order derivative. These operators generally are represented by banded rectangular matrices with fewer rows than columns. They therefore cannot be applied in iterative methods that are based on the Arnoldi process, which requires the regularizati...
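The shape mismatch the abstract refers to is easy to see by building the operator. A minimal sketch of a first-order finite-difference regularization operator (function name is illustrative):

```python
import numpy as np

def first_difference_operator(n):
    # banded rectangular approximation of the first derivative:
    # (n-1) x n, i.e. one fewer row than columns, which is why it
    # cannot be fed directly to Arnoldi-based iterations
    L = np.zeros((n - 1, n))
    idx = np.arange(n - 1)
    L[idx, idx] = -1.0
    L[idx, idx + 1] = 1.0
    return L
```

Note that constant vectors lie in the null space of this operator, so constant components of the solution are left unpenalized.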
Straightforward solution of discrete ill-posed least-squares problems with error-contaminated data does not, in general, give meaningful results, because propagated error destroys the computed solution. Error propagation can be reduced by imposing constraints on the computed solution. A commonly used constraint is the discrepancy principle, which bounds the norm of the computed solution when app...
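A minimal sketch of this idea, assuming a known noise level `delta` and a simple grid search over the regularization parameter (names and the search strategy are illustrative, not the paper's method):

```python
import numpy as np

def tikhonov_solve(A, b, lam):
    # min_x ||A x - b||^2 + lam * ||x||^2 via the normal equations
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

def discrepancy_lambda(A, b, delta, lams):
    # discrepancy principle: take the first lam (scanned smallest to
    # largest) whose residual norm reaches the noise level delta
    for lam in lams:
        x = tikhonov_solve(A, b, lam)
        if np.linalg.norm(A @ x - b) >= delta:
            return lam, x
    return lams[-1], x
```

The unregularized solve (`lam = 0`) on an ill-conditioned matrix such as a Hilbert matrix illustrates the "propagated error destroys the solution" point; the discrepancy choice damps exactly enough to match the noise.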
In this paper we present and study a new class of regularized kernel methods for learning vector fields, which are based on filtering the spectrum of the kernel matrix. These methods include Tikhonov regularization as a special case, as well as interesting alternatives such as vector valued extensions of L2-Boosting. Our theoretical and experimental analysis shows that spectral filters that yie...
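The filtering idea can be sketched generically (scalar-valued case only; names are illustrative and the paper's vector-field extensions are not reproduced). Writing the kernel matrix as K = V diag(s) Vᵀ, each method corresponds to a filter g applied to the eigenvalues:

```python
import numpy as np

def spectral_filter_solve(K, y, g):
    # apply the filter g to the spectrum of the (symmetric) kernel matrix
    s, V = np.linalg.eigh(K)
    return V @ (g(s) * (V.T @ y))

# Tikhonov regularization is the filter g(s) = 1 / (s + n*lam) ...
tikhonov = lambda lam, n: (lambda s: 1.0 / (s + n * lam))
# ... while spectral cut-off instead zeroes the small eigenvalues
cutoff = lambda tau: (lambda s: np.where(s > tau, 1.0 / s, 0.0))
```

Both filters damp the directions with small eigenvalues, which is what makes them regularizers; they differ only in how sharply the damping sets in.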
The aim of variable selection is the identification of the most important predictors that define the response of a linear system. Many techniques use a constrained least squares (LS) formulation in which the constraint is imposed via the 1-norm (the lasso), the 2-norm (Tikhonov regularisation), or a combination of these norms (the elastic net). It is always assumed that a constraint must necessarily be imposed, but the consequences of its imposition have not b...
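The difference between the two penalties is easy to see in the orthonormal-design case, where both have closed forms (a sketch only; variable names are illustrative). The 2-norm shrinks every coefficient smoothly, while the 1-norm acts by soft-thresholding and drives small coefficients exactly to zero, which is what makes the lasso select variables:

```python
import numpy as np

def ridge(A, b, lam):
    # 2-norm (Tikhonov) penalty: closed-form shrinkage
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

def soft_threshold(z, t):
    # proximal operator of the 1-norm: the core step of lasso solvers;
    # with an orthonormal design, the lasso solution IS the
    # soft-thresholded least-squares solution
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)
```

The elastic net combines both effects: soft-thresholding for sparsity plus ridge-style shrinkage for grouped, correlated predictors.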
We consider the problem of estimating a regression function on the basis of empirical data. We use a Reproducing Kernel Hilbert Space (RKHS) as our hypothesis space, and we follow the methodology of Tikhonov regularization. We show that this leads to a learning scheme that is different from the one usually considered in Learning Theory. Subject to some regularity assumptions on the regression f...
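Tikhonov regularization in an RKHS reduces, by the representer theorem, to a finite linear system in the kernel matrix. A minimal sketch with a Gaussian kernel (all names and the choice of kernel are assumptions for illustration, not the paper's notation):

```python
import numpy as np

def gaussian_kernel(X, Z, sigma=1.0):
    # k(x, z) = exp(-||x - z||^2 / (2 sigma^2)), evaluated pairwise
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def krr_fit(X, y, lam=1e-2, sigma=1.0):
    # Tikhonov in the RKHS: solve (K + n*lam*I) alpha = y
    n = len(X)
    K = gaussian_kernel(X, X, sigma)
    return np.linalg.solve(K + n * lam * np.eye(n), y)

def krr_predict(X_train, alpha, X_test, sigma=1.0):
    # f(x) = sum_i alpha_i k(x, x_i)
    return gaussian_kernel(X_test, X_train, sigma) @ alpha
```

The parameter `lam` trades fidelity to the data against the RKHS norm of the estimate, exactly the bias–variance trade-off the learning-rate analysis in such papers quantifies.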
Tikhonov regularization is a powerful tool for the solution of ill-posed linear systems and linear least squares problems. The choice of the regularization parameter is a crucial step, and many methods have been proposed for this purpose. However, efficient and reliable methods for large scale problems are still missing. In this paper approximation techniques based on the Lanczos algorithm and th...