Fast orthogonal least squares algorithm for efficient subset model selection

Authors

  • S. Chen
  • J. Wigger
Abstract

An efficient implementation of the orthogonal least squares algorithm for subset model selection is derived in this correspondence. The computational complexity of the algorithm is examined, and the results show that this new fast orthogonal least squares algorithm significantly reduces the computational requirements.
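
For orientation, the sketch below illustrates the classical OLS forward subset selection procedure (modified Gram-Schmidt orthogonalisation with the error reduction ratio as the selection criterion) that fast implementations of this kind reorganise. It is not the fast algorithm derived in the correspondence, and the names P, y and n_terms are illustrative assumptions.

```python
# Minimal sketch of classical OLS forward subset selection, assuming a
# regressor matrix P (N x M), a target vector y (N,) and a requested subset
# size n_terms.  This is the baseline procedure, not the paper's fast variant.
import numpy as np

def ols_forward_select(P, y, n_terms):
    N, M = P.shape
    W = P.astype(float).copy()        # candidate columns, orthogonalised in place
    sigma = float(y @ y)              # output energy, normalises the ERR
    selected = []
    for _ in range(n_terms):
        best_err, best_j = -1.0, -1
        for j in range(M):
            if j in selected:
                continue
            w = W[:, j]
            d = float(w @ w)
            if d < 1e-12:             # candidate is (numerically) redundant
                continue
            g = float(w @ y) / d
            err = g * g * d / sigma   # error reduction ratio of candidate j
            if err > best_err:
                best_err, best_j = err, j
        if best_j < 0:                # nothing useful left to select
            break
        selected.append(best_j)
        # orthogonalise the remaining candidates against the chosen column
        w_sel = W[:, best_j].copy()
        d_sel = float(w_sel @ w_sel)
        for j in range(M):
            if j not in selected:
                W[:, j] -= (float(w_sel @ W[:, j]) / d_sel) * w_sel
    return selected
```

Calling, for example, ols_forward_select(P, y, 5) returns the indices of the five selected regressors in selection order; each step re-orthogonalises all remaining candidates against the newly chosen column.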

Similar articles

Efficient computational schemes for the orthogonal least squares algorithm

The orthogonal least squares (OLS) algorithm is an efficient implementation of the forward selection method for subset model selection. The ability to find good subset parameters with only a linearly increasing computational requirement makes this method attractive for practical implementations. In this correspondence, we examine the computational complexity of the algorithm and present a prepro...

Full text

Sparse model identification using orthogonal forward regression with basis pursuit and D-optimality - Control Theory and Applications, IEE Proceedings

An efficient model identification algorithm for a large class of linear-in-the-parameters models is introduced that simultaneously optimises the model approximation ability, sparsity and robustness. The derived model parameters in each forward regression step are initially estimated via orthogonal least squares (OLS) and then tuned with a new gradient-descent learning algorithm ba...

Full text

Locally Regularised Orthogonal Least Squares Algorithm for the Construction of Sparse Kernel Regression Models

The paper proposes to combine orthogonal least squares (OLS) model selection with local regularisation for efficient sparse kernel data modelling. By assigning an individual regularisation parameter to each orthogonal weight in the regression model, the ability of the OLS model selection to produce a very parsimonious model with excellent generalisation performance is greatly enhanced.

Full text
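
As a rough illustration of the mechanism described in the entry above, the snippet below shows one common way an individual regularisation parameter can enter the per-candidate score in an OLS selection step. The exact criterion used in that paper may differ; the names w, y and lam_j are assumptions.

```python
# Hedged sketch: a per-weight regularisation parameter lam_j shrinking one
# orthogonal weight and its selection score.  w is an already-orthogonalised
# candidate column and y the target; the form follows a standard regularised
# error reduction ratio and is not taken verbatim from the cited paper.
import numpy as np

def regularised_score(w, y, lam_j):
    d = float(w @ w)
    g = float(w @ y) / (d + lam_j)             # ridge-style weight estimate
    return (d + lam_j) * g * g / float(y @ y)  # regularised error reduction ratio
```

Setting lam_j = 0 recovers the standard (unregularised) error reduction ratio used in plain OLS selection.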

A fast algorithm for non-negativity model selection

An efficient optimization algorithm for identifying the best least squares regression model under the condition of non-negative coefficients is proposed. The algorithm exploits an innovative solution via the unrestricted least squares and is based on the regression tree and branch-and-bound techniques for computing the best subset regression. The aim is to fill a gap in computationally tracta...

Full text

Online Streaming Feature Selection Using Geometric Series of the Adjacency Matrix of Features

Feature Selection (FS) is an important pre-processing step in machine learning and data mining. All the traditional feature selection methods assume that the entire feature space is available from the beginning. However, online streaming features (OSF) are an integral part of many real-world applications. In OSF, the number of training examples is fixed while the number of features grows with t...

Full text

Journal:
  • IEEE Trans. Signal Processing

Volume 43, Issue -

Pages -

Publication year 1995