Search results for: orthogonal forward selection

Number of results: 475,617

Journal: International Journal of Neural Systems, 2004
Xia Hong, Sheng Chen, Paul M. Sharkey

This paper introduces an automatic robust nonlinear identification algorithm using the leave-one-out test score, also known as the PRESS (Predicted REsidual Sums of Squares) statistic, and regularised orthogonal least squares. The proposed algorithm aims to achieve maximised model robustness via two effective and complementary approaches: parameter regularisation via ridge regression, and model op...
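The abstract above combines the PRESS (leave-one-out) statistic with ridge-regularised least squares. As a minimal sketch of that idea (not the paper's full identification algorithm), the LOO residuals of a linear-in-the-parameters model can be computed in closed form from the hat matrix, avoiding n separate refits; the function name and interface here are illustrative assumptions:

```python
import numpy as np

def press_statistic(X, y, ridge=0.0):
    """Sum of squared leave-one-out residuals (PRESS) for
    (optionally ridge-regularised) linear least squares.

    Uses the hat-matrix shortcut: the i-th LOO residual equals
    e_i / (1 - h_ii), where e_i is the ordinary residual and
    h_ii the i-th diagonal entry of H = X (X'X + lam I)^{-1} X'."""
    n, p = X.shape
    A = X.T @ X + ridge * np.eye(p)
    H = X @ np.linalg.solve(A, X.T)          # n x n hat matrix
    residuals = y - H @ y                     # ordinary residuals
    loo_residuals = residuals / (1.0 - np.diag(H))
    return float(np.sum(loo_residuals ** 2))
```

A model with a lower PRESS value generalises better under leave-one-out cross-validation, which is why the paper uses it as a term-selection criterion.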

2001
X. Hong

An efficient model identification algorithm for a large class of linear-in-the-parameters models is introduced that simultaneously optimises the model's approximation ability, sparsity and robustness. The model parameters derived in each forward regression step are initially estimated via orthogonal least squares (OLS), and then tuned with a new gradient-descent learning algorithm ba...

2011
Mehdi Torbatian

Asynchronism inherently exists in many communication systems, especially in multi-terminal networks, mainly due to the effects of multi-path and propagation delay. While perfect synchronization of the terminals is often presumed in theoretical analyses of communication systems, in some cases in which the nodes are randomly distributed over a geometrical area, it might be impossible to synchronize t...

2008
Yi Zhao, Raviraj Adve, Teng Joon Lim

A relay selection approach has previously been shown to outperform repetition-based scheduling for both amplify-and-forward (AF) and decode-and-forward (DF) cooperative networks. The selection method generally requires some feedback from the destination to the relays and the source, raising the issue of the interplay between performance and feedback rate. In this letter, we treat selection as a...

Journal: Jurnal Nasional Teknik Elektro dan Teknologi Informasi (JNTETI), 2017

Journal: Neurocomputing, 2006
Xunxian Wang, Sheng Chen, David Lowe, Christopher J. Harris

This paper considers sparse regression modelling using a generalised kernel model in which each kernel regressor has its individually tuned centre vector and diagonal covariance matrix. An orthogonal least squares forward selection procedure is employed to select the regressors one by one, so as to determine the model structure. After the regressor selection, the corresponding model weight para...
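Several of the abstracts above rely on the same building block: an orthogonal least squares (OLS) forward selection loop that appends regressors one by one. A minimal Gram-Schmidt-style sketch of that greedy loop follows; the function name and the unnormalised error-reduction criterion are illustrative assumptions, not the papers' exact formulation:

```python
import numpy as np

def ols_forward_selection(P, y, n_terms):
    """Greedy orthogonal forward selection.

    P        : (n, M) matrix of candidate regressors (columns).
    y        : (n,) target vector.
    n_terms  : number of regressors to select.
    Returns the list of selected column indices.

    At each step, every remaining candidate is orthogonalised against
    the regressors already chosen, and the one whose orthogonalised
    version explains the most of y (largest (w'y)^2 / (w'w)) is kept."""
    n, M = P.shape
    selected = []
    Q = []  # orthogonalised versions of the chosen regressors
    for _ in range(n_terms):
        best_err, best_j, best_w = -1.0, None, None
        for j in range(M):
            if j in selected:
                continue
            w = P[:, j].astype(float).copy()
            for q in Q:  # Gram-Schmidt against chosen regressors
                w -= (q @ P[:, j]) / (q @ q) * q
            denom = w @ w
            if denom < 1e-12:
                continue  # linearly dependent on chosen regressors
            err = (w @ y) ** 2 / denom  # error-reduction contribution
            if err > best_err:
                best_err, best_j, best_w = err, j, w
        if best_j is None:
            break
        selected.append(best_j)
        Q.append(best_w)
    return selected
```

Because each candidate is scored after orthogonalisation, the contribution of a new regressor is measured only on the part of y not already explained, which is what makes the greedy step well-defined.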

2006
Xia Hong, Sheng Chen, Christopher J. Harris

We propose a simple yet computationally efficient construction algorithm for two-class kernel classifiers. In order to optimise the classifier's generalisation capability, an orthogonal forward selection procedure is used to select kernels one by one by minimising the leave-one-out (LOO) misclassification rate directly. It is shown that the computation of the LOO misclassification rate is very effi...

Journal: IJMIC, 2006
Xunxian Wang, David Lowe, Sheng Chen, Christopher J. Harris

A sparse regression modelling technique is developed using a generalised kernel model in which each kernel regressor has its individually tuned position (centre) vector and diagonal covariance matrix. An orthogonal least squares forward selection procedure is employed to append the regressors one by one. After the determination of the model structure, namely the selection of an appropriate numb...

2004
Xunxian Wang, Sheng Chen, David Brown

Sparse regression modeling is addressed using a generalized kernel model in which each kernel regressor has its individually tuned position (center) vector and diagonal covariance matrix. An orthogonal least squares forward selection procedure is employed to append regressors one by one. After the determination of the model structure, namely the selection of a certain number of regressors, the model weig...

2008
Ying Cui, Jennifer G. Dy

This paper presents a feature selection method based on the popular transformation approach: principal component analysis (PCA). It is popular because it finds the optimal solution to several objective functions (including maximum variance and minimum sum-squared-error), and also because it provides an orthogonal basis solution. However, PCA as a dimensionality reduction algorithm does not explic...
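The abstract above contrasts PCA, which transforms features, with feature selection, which keeps original features. One simple way to turn PCA into a selector, sketched below under illustrative assumptions (this ranking criterion and function name are not taken from the paper), is to rank each original feature by its total squared loading on the leading principal components:

```python
import numpy as np

def pca_feature_select(X, k, n_components=None):
    """Rank original features by their summed squared loadings on the
    top principal components; return the indices of the k highest.

    A loading is the principal-axis entry scaled by the corresponding
    singular value, so high-variance directions dominate the score."""
    Xc = X - X.mean(axis=0)          # centre the data
    # Rows of Vt are the principal axes of the centred data
    _, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    if n_components is None:
        n_components = k
    # (n_features, n_components) matrix of squared loadings
    loadings = (Vt[:n_components].T * s[:n_components]) ** 2
    scores = loadings.sum(axis=1)
    return np.argsort(scores)[::-1][:k]
```

Unlike projecting onto the principal components, this keeps interpretable original features, which is the motivation the abstract raises for doing selection rather than transformation.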
