Search results for: least squares with exponential forgetting

Number of results: 9289347

Journal: Computers & Electrical Engineering, 2004
Jin Jiang, Youmin Zhang

In this paper, the classical least squares (LS) and recursive least squares (RLS) methods for parameter estimation have been re-examined in light of present-day computing capabilities. It has been demonstrated that for linear time-invariant systems, the performance of blockwise least squares (BLS) is always superior to that of RLS. In the context of parameter estimation for dynamic systems, the...
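Since this entry contrasts blockwise and recursive least squares, and the query itself is about exponential forgetting, the following is a minimal sketch of covariance-form recursive least squares with an exponential forgetting factor. The function name, initialization constant, and synthetic example are illustrative assumptions, not taken from the cited paper.

```python
import numpy as np

def rls_exponential_forgetting(X, y, lam=0.99, delta=1e3):
    """Recursive least squares with exponential forgetting factor lam (0 < lam <= 1)."""
    T, n = X.shape
    theta = np.zeros(n)          # parameter estimate
    P = delta * np.eye(n)        # inverse of the lam-weighted regressor correlation matrix
    for t in range(T):
        x = X[t]
        e = y[t] - x @ theta     # a priori prediction error
        Px = P @ x
        k = Px / (lam + x @ Px)  # gain vector
        theta = theta + k * e
        P = (P - np.outer(k, Px)) / lam
    return theta

# Example: track a slowly drifting parameter vector
rng = np.random.default_rng(0)
T, n = 500, 3
X = rng.standard_normal((T, n))
true_theta = np.cumsum(0.01 * rng.standard_normal((T, n)), axis=0) + 1.0
y = np.einsum("ti,ti->t", X, true_theta) + 0.1 * rng.standard_normal(T)
print(rls_exponential_forgetting(X, y, lam=0.95))
```

A forgetting factor below 1 discounts old samples geometrically, which is what lets RLS track the drifting parameters in the example; with lam = 1 it reduces to ordinary recursive least squares.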

Journal: IEEE Trans. Signal Processing, 1995
Tarun Soni, James R. Zeidler, Walter H. Ku

This paper studies the performance of the a posteriori recursive least squares lattice filter in the presence of a nonstationary chirp signal. The forward and backward partial correlation (PARCOR) coefficients for a Wiener-Hopf optimal filter are shown to be complex conjugates for the general case of a nonstationary input with constant power. Such an optimal filter is compared to a minimum mean square e...

Karim Salahshoor, Mohammad Reza Jafari

An adaptive version of a growing-and-pruning RBF neural network has been used to predict the system output and to implement Linear Model-Based Predictive Controller (LMPC) and Non-linear Model-Based Predictive Controller (NMPC) strategies. A radial-basis neural network with growing and pruning capabilities is introduced to carry out on-line model identification. An Unscented Kal...

Journal: Transactions of the Society of Instrument and Control Engineers, 1983

1997
Fernando Gil Vianna Resende, Paulo S. R. Diniz, Mineo Kaneko, Akinori Nishihara

A new method for adaptive autoregressive spectral estimation based on the least-squares criterion with multi-band decomposition of the linear prediction error and analysis of each band through independent variable forgetting factors is presented. The proposed method localizes the forgetting factor adaptation scheme in the frequency domain and in the time domain, in the sense that variations on ...
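This entry adapts a separate forgetting factor per frequency band. The band decomposition is not reproduced here; as a simpler illustration of the underlying idea of an error-driven variable forgetting factor, the sketch below uses a Fortescue-style scalar scheme, where the noise-variance guess `sigma_e2` and the clipping bounds are illustrative assumptions rather than the paper's adaptation rule.

```python
import numpy as np

def rls_variable_forgetting(X, y, sigma_e2=1.0, lam_min=0.90, lam_max=0.9999, delta=1e3):
    """RLS whose forgetting factor shrinks when the a priori error grows.

    Fortescue-style heuristic: lam stays near 1 for small errors (long memory)
    and drops toward lam_min when the error is large (faster tracking).
    """
    T, n = X.shape
    theta = np.zeros(n)
    P = delta * np.eye(n)
    for t in range(T):
        x = X[t]
        e = y[t] - x @ theta
        denom = sigma_e2 * (1.0 + x @ P @ x)
        lam = float(np.clip(1.0 - (e * e) / denom, lam_min, lam_max))
        Px = P @ x
        k = Px / (lam + x @ Px)
        theta = theta + k * e
        P = (P - np.outer(k, Px)) / lam
    return theta
```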

2008
Christoforos Anagnostopoulos, Dimitris K. Tasoulis, David J. Hand, Niall M. Adams

Variable selection for regression is a classical statistical problem, motivated by concerns that too many covariates invite overfitting. Existing approaches notably include a class of convex optimisation techniques, such as the Lasso algorithm. Such techniques are invariably reliant on assumptions that are unrealistic in streaming contexts, namely that the data is available off-line and the cor...
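To make the streaming setting concrete, the sketch below maintains exponentially down-weighted sufficient statistics for a regression fit updated one sample at a time. The generator interface and the ridge term are assumptions for illustration; the Lasso-style variable selection discussed in this entry is not implemented here.

```python
import numpy as np

def streaming_weighted_regression(stream, n, lam=0.99, ridge=1e-3):
    """Regression on a data stream with exponential down-weighting of old samples.

    R and r are lam-discounted sufficient statistics; each step solves the
    (ridge-regularized) normal equations and yields the current estimate.
    """
    R = np.zeros((n, n))
    r = np.zeros(n)
    for x, y in stream:                      # stream yields (x, y) pairs
        R = lam * R + np.outer(x, x)
        r = lam * r + y * x
        yield np.linalg.solve(R + ridge * np.eye(n), r)

# Example usage on synthetic data
rng = np.random.default_rng(1)
true_theta = np.array([1.0, -2.0, 0.5, 0.0])
pairs = [(x, x @ true_theta + 0.05 * rng.standard_normal())
         for x in rng.standard_normal((200, 4))]
for theta in streaming_weighted_regression(iter(pairs), n=4, lam=0.98):
    pass
print(theta)   # final estimate after consuming the stream
```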

[Chart: number of search results per publication year]