Performance Analysis of $l_0$ Norm Constrained Recursive Least Squares Algorithm
Abstract
Performance analysis of the l0 norm constrained Recursive Least Squares (RLS) algorithm is attempted in this paper. Although its performance is quite attractive compared with its various alternatives, no thorough theoretical analysis has been carried out so far. As in the popular l0 Least Mean Squares (LMS) algorithm, in l0-RLS an l0 norm penalty is added to the cost function to exert zero-point attraction on the instantaneous filter taps. A thorough theoretical performance analysis is conducted in this paper for white Gaussian input data under assumptions suitable for many practical scenarios. An expression for the steady-state mean square deviation (MSD) is derived and analyzed for variations of different sets of predefined variables. A Taylor series expansion based approximate linear evolution of the instantaneous MSD is also derived. Finally, numerical simulations are carried out to corroborate the theoretical analysis and are shown to match it well for a wide range of parameters.
Index Terms: Adaptive filters, sparsity, l0 norm, Recursive Least Squares (RLS) algorithm, mean square deviation, performance analysis.
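To make the role of the l0 penalty concrete, the sketch below combines a standard exponentially weighted RLS recursion with the zero-attraction term obtained from the usual exponential approximation of the l0 norm, sum_i (1 - exp(-beta*|w_i|)), linearized around zero as in l0-LMS. This is a minimal illustration under those assumptions, not the paper's exact derivation: formulations differ in how the penalty gradient is coupled into the RLS recursion, and the function name, parameter names, and default values (lam, delta, beta, gamma) are illustrative.

import numpy as np

def l0_rls_sketch(x, d, N, lam=0.99, delta=1e-2, beta=5.0, gamma=1e-4):
    """Illustrative l0-penalized RLS sketch (hypothetical, not the paper's exact update).

    x     : input signal (1-D array)
    d     : desired signal (1-D array)
    N     : filter length
    lam   : forgetting factor
    delta : inverse-correlation initialization, P(0) = I / delta
    beta  : steepness of the exp(-beta*|w|) approximation of the l0 norm
    gamma : weight of the zero-attraction (l0 penalty) term
    """
    w = np.zeros(N)
    P = np.eye(N) / delta
    for n in range(N - 1, len(x)):
        u = x[n - N + 1:n + 1][::-1]            # regressor, most recent sample first
        # Standard exponentially weighted RLS gain and inverse-correlation update
        Pu = P @ u
        k = Pu / (lam + u @ Pu)
        e = d[n] - w @ u                        # a priori error
        P = (P - np.outer(k, Pu)) / lam
        # Zero attractor: first-order approximation of the l0-norm gradient,
        # beta*sign(w)*exp(-beta*|w|) ~ beta*sign(w) - beta^2*w for |w| <= 1/beta
        g = np.where(np.abs(w) <= 1.0 / beta,
                     beta * np.sign(w) - beta**2 * w, 0.0)
        w = w + k * e - gamma * g               # RLS update plus sparsity attraction
    return w

A quick synthetic check would generate a sparse true tap vector, filter white Gaussian noise through it plus observation noise, and compare the steady-state MSD of this sketch against plain RLS (gamma = 0).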
Similar resources
Superlinearly convergent exact penalty projected structured Hessian updating schemes for constrained nonlinear least squares: asymptotic analysis
We present a structured algorithm for solving constrained nonlinear least squares problems, and establish its local two-step Q-superlinear convergence. The approach is based on an adaptive structured scheme due to Mahdavi-Amiri and Bartels of the exact penalty method of Coleman and Conn for nonlinearly constrained optimization problems. The structured adaptation also makes use of the ideas of N...
Recursive Algorithm for L1 Norm Estimation in Linear Models
L1 norm estimator has been widely used as a robust parameter estimation method for outlier detection. Different algorithms have been applied for L1 norm minimization among which the linear programming problem based on the simplex method is well known. In the present contribution, in order to solve an L1 norm minimization problem in a linear model, an interior point algorithm is developed which ...
Adaptive Estimation of Sparse Signals: Where RLS Meets the l1-Norm
Using the l1-norm to regularize the least-squares criterion, the batch least-absolute shrinkage and selection operator (Lasso) has well-documented merits for estimating sparse signals of interest emerging in various applications where observations adhere to parsimonious linear regression models. To cope with high complexity, increasing memory requirements, and lack of tracking capability that b...
Linearly-constrained line-search algorithm for adaptive filtering
We develop a linearly-constrained line-search adaptive filtering algorithm by incorporating the linear constraints into the least squares problem and searching the solution (filter weights) along the Kalman gain vector. The proposed algorithm performs close to the constrained recursive least squares (CRLS) algorithm while having a computational complexity comparable to the constrained least mea...
Performance Analysis of $l_0$ Norm Constraint Least Mean Square Algorithm
As one of the recently proposed algorithms for sparse system identification, l0 norm constraint Least Mean Square (l0-LMS) algorithm modifies the cost function of the traditional method with a penalty of tap-weight sparsity. The performance of l0-LMS is quite attractive compared with its various precursors. However, there has been no detailed study of its performance. This paper presents compre...
Journal: CoRR
Volume: abs/1602.03283
Pages: -
Publication year: 2016