Search results for: least mean squares method

Number of results: 2,408,290

2004
Wijnand Hoitinga

Thesis front matter: List of Figures; List of Tables; Part I: Direct Minimization of the Equation Residuals

Journal: :Signal Processing 2009
Shengkui Zhao Zhihong Man Suiyang Khoo Hong Ren Wu

An improved robust variable step-size least mean square (LMS) algorithm is developed in this paper. Unlike many existing approaches, we adjust the variable step-size using a quotient form of filtered versions of the quadratic error. The filtered estimates of the error are based on exponential windows, applying different decaying factors for the estimations in the numerator and denominator. The ...
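
The quotient-form idea described in this abstract can be sketched as follows. This is a hedged illustration only, assuming the step size is scaled by the ratio of a fast and a slow exponentially windowed estimate of the squared error; the exact update rule of Zhao et al. is not reproduced here, and the names `vss_lms`, `mu_max`, `beta_fast`, and `beta_slow` are illustrative.

```python
import math

def vss_lms(x, d, order=4, mu_max=0.05, beta_fast=0.90, beta_slow=0.99):
    """Variable step-size LMS sketch: the step is mu_max scaled by the
    (clipped) quotient of a fast and a slow exponentially windowed
    estimate of the squared error, so it stays large during transients
    and shrinks near convergence."""
    w = [0.0] * order
    p_fast = p_slow = 1e-6          # filtered squared-error estimates
    errors = []
    for n in range(order, len(x)):
        u = x[n - order:n][::-1]    # most recent sample first
        e = d[n] - sum(wi * ui for wi, ui in zip(w, u))
        p_fast = beta_fast * p_fast + (1.0 - beta_fast) * e * e
        p_slow = beta_slow * p_slow + (1.0 - beta_slow) * e * e
        mu = mu_max * min(1.0, p_fast / p_slow)   # quotient form
        w = [wi + mu * e * ui for wi, ui in zip(w, u)]
        errors.append(e)
    return w, errors
```

During the transient the two estimates are comparable, so the step stays near `mu_max`; as the filter converges the fast estimate drops first, pulling the step down.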

Journal: :Signal Processing 2006
Jerónimo Arenas-García Manel Martínez-Ramón Ángel Navia-Vázquez Aníbal R. Figueiras-Vidal

For least mean-square (LMS) algorithm applications, it is important to improve the speed of convergence vs the residual error trade-off imposed by the selection of a certain value for the step size. In this paper, we propose to use a mixture approach, adaptively combining two independent LMS filters with large and small step sizes to obtain fast convergence with low misadjustment during station...
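
A convex combination of a fast and a slow LMS filter can be sketched as below. This is a common mixing scheme, shown only as an assumption-laden illustration; the precise adaptation rule of Arenas-García et al. may differ, and `mu_fast`, `mu_slow`, and `mu_a` are illustrative parameters.

```python
import math

def combined_lms(x, d, order=2, mu_fast=0.05, mu_slow=0.005, mu_a=0.5):
    """Sketch of an adaptive combination of two LMS filters: each component
    adapts with its own step size and its own error, while a sigmoid-gated
    mixing weight lam(n) is trained by gradient descent on the combined
    output error."""
    w1, w2 = [0.0] * order, [0.0] * order
    a = 0.0                                  # pre-sigmoid mixing parameter
    errors = []
    for n in range(order, len(x)):
        u = x[n - order:n][::-1]
        y1 = sum(wi * ui for wi, ui in zip(w1, u))
        y2 = sum(wi * ui for wi, ui in zip(w2, u))
        lam = 1.0 / (1.0 + math.exp(-a))     # mixing weight in (0, 1)
        e = d[n] - (lam * y1 + (1.0 - lam) * y2)
        # each component filter adapts independently with its own error
        w1 = [wi + mu_fast * (d[n] - y1) * ui for wi, ui in zip(w1, u)]
        w2 = [wi + mu_slow * (d[n] - y2) * ui for wi, ui in zip(w2, u)]
        # gradient step on the mixing parameter
        a += mu_a * e * (y1 - y2) * lam * (1.0 - lam)
        errors.append(e)
    return errors
```

The intent matches the abstract: the fast filter dominates during transients, the slow one during steady state, giving fast convergence with low misadjustment.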

2009
Jiankan Yang

AbslractThis paper studies the effect of array calibration errors on the performance of various direction 6nding @F) based signal copy algorithms. Unlike blind copy me€hods, this class of algorithms requires an estimate of the directions of arrival (DOA’s) of the signals in order to compute the copy weight vectors. Under the assumption that the observation time is sufficiently long, the followi...

1996
Marc Moonen

In this paper, first a brief review is given of a fully pipelined algorithm for recursive least squares (RLS) estimation, based on so-called ‘inverse updating’. Then a specific class of (block) RLS algorithms is considered, which embraces normalized LMS as a special case (with block size equal to one). It is shown that such algorithms may be cast in the ‘inverse-updating RLS’ framework. This all...
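
The normalized LMS algorithm that this abstract treats as the block-size-one special case can be sketched minimally as follows (a standard textbook form, not the pipelined inverse-updating scheme of the paper itself):

```python
def nlms(x, d, order=2, mu=0.5, eps=1e-8):
    """Normalized LMS: the LMS update is divided by the instantaneous
    input energy, making the effective step size dimensionless and
    robust to changes in input power. eps guards against division by
    zero for near-silent inputs."""
    w = [0.0] * order
    for n in range(order, len(x)):
        u = x[n - order:n][::-1]
        e = d[n] - sum(wi * ui for wi, ui in zip(w, u))
        g = mu * e / (sum(ui * ui for ui in u) + eps)
        w = [wi + g * ui for wi, ui in zip(w, u)]
    return w
```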

Journal: :CoRR 2014
Azam Khalili Amir Rastegarnia

In this paper we study the impact of network size on the performance of incremental least mean square (ILMS) adaptive networks. Specifically, we consider two ILMS networks with different number of nodes and compare their performance in two different cases including (i) ideal links and (ii) noisy links. We show that when the links between nodes are ideal, increasing the network size improves the...
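
One cycle of an incremental LMS network can be sketched as below, under ideal (noise-free) links as in case (i) of the abstract; `ilms_cycle` and the ring-topology assumption are illustrative, not the authors' exact formulation.

```python
def ilms_cycle(w, nodes, mu=0.1):
    """One incremental LMS cycle: the weight estimate visits each node in
    turn around the ring, and each node applies a single LMS update with
    its local (regressor, measurement) pair before passing the estimate
    to its neighbour."""
    for u, d in nodes:
        e = d - sum(wi * ui for wi, ui in zip(w, u))
        w = [wi + mu * e * ui for wi, ui in zip(w, u)]
    return w
```

With ideal links, adding nodes means more updates per cycle, which is consistent with the abstract's observation that increasing the network size improves performance in that case.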

Journal: :IEEE Trans. Signal Processing 1997
Wen-Rong Wu Po-Cheng Chen

Autoregressive (AR) modeling is widely used in signal processing. The coefficients of an AR model can be easily obtained with a least mean square (LMS) prediction error filter. However, it is known that this filter gives a biased solution when the input signal is corrupted by white Gaussian noise. Treichler suggested the -LMS algorithm to remedy this problem and proved that the mean weight vect...
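
The LMS prediction-error filter mentioned here can be sketched as follows, with a hypothetical AR(2) demo. This shows only the plain (biased-under-noise) predictor; Treichler's remedy is not implemented, and the AR coefficients 0.6 and -0.2 are made-up demo values.

```python
import random

def ar_lms(x, order=2, mu=0.005):
    """One-step-ahead LMS predictor: the converged weights estimate the
    AR coefficients of x. As the abstract notes, the estimate is biased
    when x is observed in additive white noise."""
    w = [0.0] * order
    for n in range(order, len(x)):
        u = x[n - order:n][::-1]
        e = x[n] - sum(wi * ui for wi, ui in zip(w, u))
        w = [wi + mu * e * ui for wi, ui in zip(w, u)]
    return w

# Hypothetical demo: AR(2) process x[n] = 0.6 x[n-1] - 0.2 x[n-2] + v[n]
random.seed(0)
x = [0.0, 0.0]
for n in range(2, 20000):
    x.append(0.6 * x[-1] - 0.2 * x[-2] + random.gauss(0.0, 1.0))
w = ar_lms(x)
```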

1997
Yongbin Wei Saul B. Gelfand James V. Krogmeier

In many identification and tracking problems, an accurate estimate of the measurement noise variance is available. A partially adaptive LMS-type algorithm is developed which can exploit this information while maintaining the simplicity and robustness of LMS. This noise constrained LMS (NCLMS) algorithm is a type of variable step-size LMS algorithm, which is derived by adding constraints to the m...
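
A sketch in the spirit of NCLMS is given below: the step size grows with the estimated excess of the error power over the known noise floor and shrinks as the filter converges. This is an assumption-laden illustration; the published NCLMS update differs in form, and `mu0`, `beta`, and `gamma` are illustrative parameters.

```python
def nc_lms(x, d, order=2, sigma2_v=0.0, mu0=0.02, beta=0.99, gamma=2.0):
    """Noise-constrained variable step-size LMS sketch: the step is
    enlarged in proportion to the estimated excess of the filtered
    squared error over the known measurement-noise variance sigma2_v,
    and decays toward mu0 as the error approaches the noise floor."""
    w = [0.0] * order
    p = sigma2_v                       # filtered squared-error estimate
    for n in range(order, len(x)):
        u = x[n - order:n][::-1]
        e = d[n] - sum(wi * ui for wi, ui in zip(w, u))
        p = beta * p + (1.0 - beta) * e * e
        mu = mu0 * (1.0 + gamma * max(0.0, p - sigma2_v))
        w = [wi + mu * e * ui for wi, ui in zip(w, u)]
    return w
```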

Thesis: Ministry of Science, Research and Technology - Tarbiat Modares University, 1386

No abstract available.

Journal: :IEEE Trans. Information Theory 1984
Bernard Widrow Eugene Walach

A fundamental relationship exists between the quality of an adaptive solution and the amount of data used in obtaining it. Quality is defined here in terms of “misadjustment,” the ratio of the excess mean square error (MSE) in an adaptive solution to the minimum possible MSE. The higher the misadjustment, the lower the quality is. The quality of the exact least squares solution is compared wit...
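
The misadjustment figure defined in this abstract can be computed directly from a steady-state error record, as a minimal illustration:

```python
def misadjustment(errors, min_mse):
    """Misadjustment M = (steady-state MSE - minimum MSE) / minimum MSE,
    the quality measure defined in the abstract above."""
    mse = sum(e * e for e in errors) / len(errors)
    return (mse - min_mse) / min_mse
```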

[Chart: number of search results per year]