Sparse kernel regression modeling using combined locally regularized orthogonal least squares and D-optimality experimental design
Abstract
This note proposes an efficient nonlinear identification algorithm that combines locally regularized orthogonal least squares (LROLS) model selection with a D-optimality experimental design. The proposed algorithm aims to maximize model robustness and sparsity via two effective and complementary approaches. The LROLS method alone is capable of producing a very parsimonious model with excellent generalization performance. The D-optimality design criterion further enhances model efficiency and robustness. An added advantage is that the user only needs to specify a weighting for the D-optimality cost in the combined model selection criterion, and the entire model construction procedure then becomes automatic. The value of this weighting does not critically influence the model selection procedure, and it can be chosen with ease from a wide range of values.
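As a rough illustration of the combined selection loop described above (a sketch, not the authors' exact implementation), the code below performs orthogonal forward selection in which each candidate regressor is scored by its regularized error-reduction ratio plus a weighted log(wᵀw) D-optimality increment, stopping when no candidate improves the combined criterion. The function name and the default values of `lam` (local regularization) and `beta` (D-optimality weighting) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def lrols_doptimality(Phi, y, lam=1e-3, beta=1e-4):
    """Sketch of combined LROLS + D-optimality forward selection.

    Phi  : (n, M) candidate regressor matrix
    y    : (n,) target vector
    lam  : illustrative regularization parameter (uniform here; the
           paper tunes one per orthogonal weight)
    beta : illustrative weighting on the D-optimality cost
    Returns the list of selected column indices, in selection order.
    """
    n, M = Phi.shape
    yty = float(y @ y)
    selected = []
    residual_cols = Phi.astype(float).copy()  # columns deflated as we go
    while len(selected) < M:
        best, best_crit = None, 0.0
        for k in range(M):
            if k in selected:
                continue
            w = residual_cols[:, k]           # orthogonalized candidate
            wtw = float(w @ w)
            if wtw < 1e-12:                   # numerically dependent column
                continue
            g = float(w @ y) / (wtw + lam)    # regularized weight estimate
            # combined criterion: regularized error-reduction ratio
            # plus weighted log-determinant (D-optimality) increment
            crit = (g * g * (wtw + lam) + beta * np.log(wtw)) / yty
            if crit > best_crit:
                best, best_crit = k, crit
        if best is None:                      # criterion non-positive: stop
            break
        w = residual_cols[:, best].copy()
        selected.append(best)
        # Gram-Schmidt: deflate remaining candidates against chosen regressor
        proj = (residual_cols.T @ w) / float(w @ w)
        residual_cols -= np.outer(w, proj)
    return selected
```

On noise-free data generated from a few columns of `Phi`, the loop recovers those columns first; the D-optimality term mainly penalizes candidates whose orthogonalized energy wᵀw is small, which is what makes the resulting design well conditioned.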
Related articles
Sparse Multi-Output Radial Basis Function Network Construction Using Combined Locally Regularized Orthogonal Least Square and D-Optimality Experimental Design
A new construction algorithm for multi-output radial basis function (RBF) network modelling is introduced by combining a locally regularized orthogonal least squares (LROLS) model selection with a D-optimality experimental design. The proposed algorithm aims to achieve maximized model robustness and sparsity via two effective and complementary approaches. The LROLS method alone is capable of pro...
Sparse multioutput radial basis function network construction using combined locally regularised orthogonal least square and D-optimality experimental design - Control Theory and Applications, IEE Proceedings-
A construction algorithm for multioutput radial basis function (RBF) network modelling is introduced by combining a locally regularised orthogonal least squares (LROLS) model selection with a D-optimality experimental design. The proposed algorithm aims to achieve maximised model robustness and sparsity via two effective and complementary approaches. The LROLS method alone is capable of produci...
Locally Regularised Orthogonal Least Squares Algorithm for the Construction of Sparse Kernel Regression Models
The paper proposes to combine an orthogonal least squares (OLS) model selection with local regularisation for efficient sparse kernel data modelling. By assigning an individual regularisation parameter to each orthogonal weight in the regression model, the ability of the OLS model selection to produce a very parsimonious model with excellent generalisation performance is greatly enhanced.
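The local-regularisation idea above can be sketched numerically: each orthogonal weight gets its own regularisation parameter λᵢ, estimated by an evidence-style reestimation of the form λᵢ ← γᵢ·eᵀe / ((N − Σγ)·gᵢ²) with γᵢ = wᵢᵀwᵢ/(λᵢ + wᵢᵀwᵢ). The function names and the small stabilising epsilon below are illustrative assumptions, not the paper's code.

```python
import numpy as np

def regularized_weights(W, y, lam):
    """Regularized weight estimates g_i = w_i^T y / (w_i^T w_i + lam_i)
    for orthogonal regressors stored as the columns of W."""
    wtw = np.sum(W * W, axis=0)
    return (W.T @ y) / (wtw + lam)

def update_lambdas(W, y, lam):
    """One evidence-style reestimation step (sketch):
    lam_i <- gamma_i * e^T e / ((N - sum(gamma)) * g_i^2),
    gamma_i = w_i^T w_i / (lam_i + w_i^T w_i).
    Irrelevant regressors (tiny g_i) get a large lambda and are
    effectively pruned; a small epsilon guards the division."""
    N = len(y)
    wtw = np.sum(W * W, axis=0)
    g = regularized_weights(W, y, lam)
    gamma = wtw / (lam + wtw)
    e = y - W @ g                      # residual under current weights
    return gamma * float(e @ e) / ((N - gamma.sum()) * g * g + 1e-12)
```

Iterating this update drives λᵢ toward zero for regressors the data supports and toward large values for the rest, which is what lets OLS selection under local regularisation prune aggressively.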
Parsimonious Support Vector Regression using Orthogonal Forward Selection with the Generalized Kernel Model
Sparse regression modeling is addressed using a generalized kernel model in which each kernel regressor has its individually tuned position (center) vector and diagonal covariance matrix. An orthogonal least squares forward selection procedure is employed to append regressors one by one. After the determination of the model structure, namely the selection of a certain number of regressors, the model weig...
Robust nonlinear model identification methods using forward regression
In this correspondence, new robust nonlinear model construction algorithms for a large class of linear-in-the-parameters models are introduced to enhance model robustness via combined parameter regularization and new robust structural selection criteria. In parallel to parameter regularization, we use two classes of robust model selection criteria based on either experimental design criteria tha...
Journal:
- IEEE Trans. Automat. Contr.
Volume 48, Issue -
Pages: -
Publication date: 2003